Thursday, December 21, 2017

Dear Black people, Please learn how to swim (and code)!



Dear Brothers and Sisters,

This is a letter of caution and encouragement.

On August 2nd, 2010, 15-year-old DeKendrix Warner was enjoying the cooling waters of the Red River outside of Shreveport, Louisiana. His family and some friends had gathered for a late-summer barbecue. Several of the group's teenagers were splashing and playing along the shallow bank when suddenly DeKendrix slipped into a 25-foot-deep drop-off. He didn't know how to swim. He yelled for help. Three of his cousins and three of his friends jumped into action to save him. None of them knew how to swim. Not 18-year-old Litrelle Stewart, not 15-year-old LaTevin Stewart, not 17-year-old LeDarius Stewart, nor 13-year-old Takeitha Warner, nor 14-year-old JaMarcus Warner, and not 17-year-old JaTavious Warner; they all drowned. None of their parents knew how to swim; they watched helplessly as their children sank one by one beneath the river's surface, thrashing and screaming for help.

A man named Christopher Patlan heard the screams and saw what was happening. He was close enough to the scene that he was able to dive into the water and save one child: DeKendrix.

The six children that perished, the parents who watched powerlessly as they died, and DeKendrix are Black.

Mr. Patlan is White and Hispanic.

According to the Centers for Disease Control, Black children ages 5 to 19 drown at 5.5 times the rate of White children. Beneath the devastation that resulted from this preventable loss of young, precious lives lies a legacy of bondage, discrimination, and segregation.

And this history of disparity in water-safety knowledge and experience has an eerie parallel in computer science.

In Stuck in the Shallow End: Education, Race, and Computing, researcher Jane Margolis et al. explain how grade-school structures, course offerings, student-to-counselor ratios, and the belief systems held by students, parents, and teachers have led to Black and Latino/a students receiving a disproportionately low number of undergraduate and advanced degrees in computer science (around 7% according to American Community Survey data). Ms. Margolis connects today's inequalities in access to rigorous computer science education to a past in which Black slaves were forbidden to learn how to swim for fear they would escape, and Jim Crow segregation prevented Black people from using public pools where they would have learned and refined swimming and water-safety skills.

Perhaps DeKendrix’s cousins and friends would still be present, if that past never was.

In closing, I encourage you to read Margolis's work, as it beautifully lays out the case for re-thinking and re-designing the way Black children are prepared to take their places in the rapidly and incessantly evolving global economy. Not all Black children need to major in computer science or become programmers. But the problem-solving skills characteristic of a proper computer science education can go a long way toward helping young Brothers and Sisters move people of color toward first economic, then social, and thus political parity.

Sincerely,

Technology and the Brothers


Cullen Jones, Swimmer, Gold-Medalist, World Record Holder, Brother

Follow Me on Twitter: @TechAndDaBros

Thursday, December 7, 2017

I dated Bitcoin, but I married Blockchain




Bitcoin is a movement by itself, but it’s a force when combined with Blockchain!

Shout out to Ne-Yo!

By now, you’ve probably heard of Bitcoin. Whether you’ve been the unlucky victim of a ransomware attack and been directed to pay up in Bitcoin, or you’ve been looking to further diversify your investment portfolio, you’ve probably read or heard about the first “cryptocurrency,” which is all the rage these days. As of this writing, the price of one Bitcoin is approaching $17,000 USD. I’m no finance expert, but I’ve recently taken the plunge and invested, or should I say traded, a few bucks for a few fractions of a Bitcoin. Hopefully, I’ll be able to retire off of my investment/trade in another 12 to 18 months! But if not, I’m still excited about Bitcoin and especially the technology that undergirds it, called BLOCKCHAIN.

I’m not going to go into the nearly 10-year history of Bitcoin and Blockchain; there are plenty of web resources out there to fill you in:




I only want to convey my high interest and enthusiasm for these platforms. The Blockchain in particular has the potential to revolutionize human affairs in much the way the Internet did.

In particular, a blockchain has been defined as a:

"...ledger of facts, replicated across several computers assembled in a peer-to-peer network. Facts can be anything from monetary transactions to content signature. Members of the network are anonymous individuals called nodes. All communication inside the network takes advantage of cryptography to securely identify the sender and the receiver. When a node wants to add a fact to the ledger, a consensus forms in the network to determine where this fact should appear in the ledger; this consensus is called a block."

A decentralized, secure, and immutable electronic ledger can be used to verify information needed to complete transactions all over the world, for much less than current rates charged by banks and other financial intermediaries, and in much less time (minutes compared to hours, days, or even weeks). But the Blockchain is not just a financial technology tool. It is currently being tested to develop driverless vehicles, store government data, and host educational competitions.
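To make the ledger idea concrete, here is a minimal Python sketch of a hash-linked chain of “facts.” This is not Bitcoin’s actual protocol (there is no mining, networking, or consensus here, and the block fields are simplified assumptions of mine), but it shows why the ledger is effectively immutable: tampering with any block is instantly detectable.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents (everything except the stored hash itself)."""
    payload = {k: block[k] for k in ("index", "facts", "prev_hash")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(index, facts, prev_hash):
    """Bundle a list of facts with a link to the previous block's hash."""
    block = {"index": index, "facts": facts, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def is_valid(chain):
    """Every block must match its own hash AND point at the previous block's hash."""
    return all(
        chain[i]["hash"] == block_hash(chain[i])
        and (i == 0 or chain[i]["prev_hash"] == chain[i - 1]["hash"])
        for i in range(len(chain))
    )

# Build a tiny three-block ledger of "facts" (here, toy transactions).
chain = [make_block(0, ["genesis"], "0" * 64)]
chain.append(make_block(1, ["Alice pays Bob 5"], chain[-1]["hash"]))
chain.append(make_block(2, ["Bob pays Carol 2"], chain[-1]["hash"]))

print(is_valid(chain))                      # True
chain[1]["facts"] = ["Alice pays Bob 500"]  # try to rewrite history...
print(is_valid(chain))                      # False -- the tampering is detectable
```

Because each block’s hash covers the previous block’s hash, changing one “fact” would force an attacker to recompute every later block, and in a real network, to out-vote everyone else’s copy of the ledger too.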

Now, RIGHT NOW, is the time to get familiar with the way this technology works, and to acquire the skills (e.g., JavaScript, cryptography, linear algebra) needed to develop applications that can leverage Blockchain.

Free (for now) Learning Resources and Information:





So what’s in this for the Brothers (and Sisters)?

I Googled “Blockchain and black people”, and came across two very enlightening and very thought-provoking articles:

Two years ago, at the Atlantic, Kyle Coward asked, “Why Are So Few Black People Using Bitcoin?”

Ed Dunn, via Medium, seeks to clarify the definition of “decentralization” as it relates to Blockchain, and states that Brothers and Sisters are Blockchain pioneers.

Given the developing hysteria surrounding Bitcoin, and the limitless potential of application for Blockchain technology, there is only one appropriate action to take.

Like Spike Lee said...


***P.S.***

This YouTube video provides a good explanation of Blockchain, that literally anyone can grasp:





Follow Me on Twitter: @TechAndDaBros

Monday, November 27, 2017

The HOV Lane, Virtualization 101



When it comes to computing, think of virtualization like the high occupancy vehicle (HOV) lane of a major highway.





Virtualization is essentially the shared usage of computer resources among a large group of users. Like carpooling, in which commuters share in the cost of fuel, the goal of virtualization is to increase the efficiency of both the end users and the expensive computer resources they share.

“Where you goin’ fool!? Me too, hop in!!!”


Traditional servers (computers that literally serve files, data, and applications to other computers) consisted of a single, physical combination of processing, storage, and networking resources.


They ran one operating system, and most operated at just 5 to 15 percent of their total load capacity.
Virtualization enabled multiple operating systems to run on a single server.



Modern server resources tend to be very robust (think dozens of computer chips, trillions of bytes of memory and storage, and high-speed network connections). All of these resources are aggregated, pooled, and allocated to virtual machines, which are software versions of computers.



End users (me and you, yo mama and yo cousin too) connect to the files, folders, websites, and other programs installed on these virtual machines as if they were installed on physical machines.
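The resource pooling described above can be sketched as a toy scheduler in Python. This is nothing like a real hypervisor’s implementation (the class and names here are hypothetical, and real hypervisors also handle CPU scheduling, memory isolation, live migration, and much more); it only illustrates the core idea of carving one robust physical host into several smaller “computers.”

```python
class Host:
    """A physical server whose CPU cores and RAM are pooled for virtual machines."""

    def __init__(self, cores, ram_gb):
        self.cores = cores
        self.ram_gb = ram_gb
        self.vms = []

    def free_resources(self):
        """Return (cores, ram_gb) not yet allocated to any VM."""
        used_cores = sum(vm["cores"] for vm in self.vms)
        used_ram = sum(vm["ram_gb"] for vm in self.vms)
        return self.cores - used_cores, self.ram_gb - used_ram

    def create_vm(self, name, cores, ram_gb):
        """Allocate a slice of the pool to a new virtual machine."""
        free_cores, free_ram = self.free_resources()
        if cores > free_cores or ram_gb > free_ram:
            raise RuntimeError(f"not enough free resources for {name}")
        vm = {"name": name, "cores": cores, "ram_gb": ram_gb}
        self.vms.append(vm)
        return vm

# One robust physical host, carved into several smaller "computers".
host = Host(cores=32, ram_gb=256)
host.create_vm("web-server", cores=4, ram_gb=16)
host.create_vm("database", cores=8, ram_gb=64)
print(host.free_resources())  # (20, 176)
```

Notice the payoff: instead of two physical servers each idling at 5 to 15 percent capacity, one host carries both workloads and still has most of its pool free for more riders in the HOV lane.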



The Main Ingredient: the HYPERVISOR (aka the Virtual Machine Monitor)

Virtualization's essential component is known as the hypervisor.
The smart people at IBM, who invented virtualization, called the hypervisor the virtual machine monitor (VMM for short).

The hypervisor (depicted below as the virtualization layer) is software installed on a computer or server that allows it to share its resources.



Hypervisors are classified as one of these two types:

Type 1 (bare metal or native): Bare-metal hypervisors are software systems that run directly on the host's hardware to control the hardware and monitor the guest operating systems. Consequently, the guest operating system runs on a separate level above the hypervisor.

Type 2 (hosted): Hosted hypervisors are designed to run within a traditional operating system. In other words, a hosted hypervisor adds a distinct software layer on top of the host operating system, and the guest operating system becomes a third software level above the hardware.

Source: Oracle






The major virtualization products (and vendors) you will encounter today are:

VMware and the vSphere hypervisor

Microsoft, which produces Hyper-V and Azure

Citrix and the Xen hypervisor

Red Hat, whose Linux distribution utilizes a hypervisor called KVM

…and Amazon Web Services

I have administered and engineered some combination of these different technology solutions for over 10 years. I am certified on the VMware, Microsoft, and Citrix platforms. I have previously written about the importance of certifications, and particularly starting your career with CompTIA A+. According to the latest A+ exam objectives, here’s what you need to know about virtualization:



The focus appears to be on the Type 2, hosted hypervisor, so review the following:

Client-Side Virtualization



As you build knowledge and progress in your career, other virtualization technologies will become more relevant (and prevalent). I encourage you to get familiar with them. They include:



~and~




Now that you understand a little bit about virtualization, let’s ride.

We in the HOV, I mean virtualization lane like…


two dope boyz aggregated in a Cadillac.


Follow Me on Twitter: @TechAndDaBros

Thursday, November 9, 2017

Happy Birthday Benjamin Banneker!



On this day in 1731, Benjamin Banneker was born free near Baltimore, Maryland. During his life, he became a renowned farmer, mathematician, and astronomer. As a teenager, he invented an irrigation system for his family’s farm, and at the age of 21, he constructed what is often described as the first clock built entirely in North America. The clock was made completely of wood, and kept accurate time for 50 years.

Banneker’s most noted accomplishment is that of architect of Washington, D.C. Historians tell us that after a year of work, the French architect hired by George Washington to design the capital quit and took all of the plans with him. Banneker, who was placed on the development committee at Thomas Jefferson's request, saved the project by reproducing from memory a complete layout of the streets, parks, and major buildings.

Washington, D.C. itself is a monument to the genius of this great man.

Benjamin Banneker died in 1806 at the age of 75.

Follow Me on Twitter: @TechAndDaBros

Monday, November 6, 2017

Who invented the PC? A brother named Mark Dean.

Dr. Mark E. Dean
 IBM/AP Images 

The personal computer (or PC) is probably the most important invention worldwide in the past 40 years. The first machines that we recognize as computers were developed in the 1940s, evolved through the 1970s, and were called mainframes. The PC, designed for use by one person to do things like create spreadsheets at work or play games at home, was introduced in the 1980s.

In 1981, Mark Dean led a team of engineers at an IBM research facility in Boca Raton, Florida in giving the world a powerful, reliable, and most importantly, affordable computer system. Dean himself was responsible for developing the graphical interface, which led to color monitors. He also co-invented the ISA system bus, which allowed PC connections with interchangeable peripheral devices like printers and modems made by different vendors. By the end of the 1980s, PC sales would reach 100 million units per year.

Mark Dean and the IBM 5150 PC

But Dean didn’t stop there. In 1999, he led another group of IBM engineers in creating the first computer chip that could perform 1 billion calculations in 1 second. The Gigahertz CPU (Central Processing Unit) is almost standard today for any computing device.
 
Today, Mark Dean is a Distinguished Professor at the University of Tennessee’s College of Engineering. He is a member of the National Inventors Hall of Fame, he was the first Black IBM Fellow, and he holds three of the nine original patents related to the invention and development of the personal computer.

Earlier this year, NetworkWorld, asked Who should be on the Tech Mount Rushmore?
 
Here’s my opinion:




From left to right: Bill Gates, Mark Dean, Steve Jobs, and Philip Emeagwali (come back later for his story).








Follow Me on Twitter: @TechAndDaBros

Thursday, November 2, 2017

The A+ Certification & the Importance of Symbols

“…a word or an image is symbolic when it implies something more than its obvious and immediate meaning.” –Carl Gustav Jung, Man and His Symbols

This image changed my life:


It was created by the Computing Technology Industry Association (CompTIA).

CompTIA is the leading provider of vendor-neutral IT certifications in the world.

On May 4, 2008, that image, along with the CompTIA A+ certification that it symbolizes, announced to the world that I was in possession of the knowledge and skills necessary to fix and configure the most common computer technologies in use at that time.

It symbolized my competency as a Technologist.

I was fortunate to already be employed as a software support representative, but my career took off when I decided to get certified.

For me, the A+ certification led not only to more money, but to more responsibility, which led to more skill development, which led to more opportunities, which led to more money, and on and on and on, and this virtuous cycle continues today.

A+ will not guarantee you a job, but I’m recommending it, because it worked for me.

I didn’t study computer science in college.

I only had the desire to do better than I was doing.

So, for anyone interested in starting a career in IT support, the A+ validates computer support skills. You will learn how to fix, enhance, and eventually design computer systems. 

*Note: computer support is distinct from computer programming, which will be covered in future posts.

Search the internet, and you’ll find plenty of favor, and a good amount of opposition, regarding the A+ certification (and IT certifications in general).

Again, this is my advice, based on my experience. The A+ worked for me, and here is more evidence that it can work for you:






Honorable Mentions Based on Popularity

Popularity    Certification             Salary
1             (logo image not shown)    $79,877
2             (logo image not shown)    $81,601
3             (logo image not shown)    $83,945
4             (logo image not shown)    $89,147



The A+ Certification requires two tests. Each costs a little over $200. That amount may be hard to come by. Trust me, I know. It was going to take me 3 to 6 months to save for the exam, but I was blessed to be able to borrow the money from one of my cousins. If you don’t have a benevolent kin, don’t worry, use the 3 to 6 months I believe it will take for you to prepare for the tests to save up.

Let’s assume that you currently make minimum wage, $7.25/hr. Let’s also assume that you work 40 hours a week. After taxes, you should bring home between $900 and $1000 a month. If you disciplined yourself to save $100 per month, it would take you only 4 months to save enough to cover the exam fees.
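The plan above is simple enough to check in a few lines of Python (a sketch of the arithmetic only, not financial advice):

```python
# The savings plan above, worked out in code.
exam_cost = 2 * 200        # two A+ exams at roughly $200 each
monthly_savings = 100      # out of a ~$900-$1,000/month take-home pay
months_needed = -(-exam_cost // monthly_savings)  # ceiling division
print(months_needed)  # 4
```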

AND IF YOU DON’T THINK YOU CAN DO THAT, READ THIS:


(*Download your own copy here: The Richest Man in Babylon)


How to get started

If you can understand the following concepts, you can pass the A+ tests.

There are four main components of a modern computing device:


1. The Central Processing Unit, or CPU, is the “brains,” or calculator, of the computer. It processes data and executes instructions, or code.

2. Main Memory, or Random Access Memory (RAM), temporarily stores data, remembers instructions, and helps the CPU execute code faster and more efficiently (more memory = faster computer).

3. Storage/Hard Drive is for permanent data storage. Unlike RAM, the hard drive retains data even when the computing device is turned off.

4. Not pictured, but very important, is the Network Interface. This could be a “card” where an Ethernet cable is connected, or a wireless antenna.
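If you want to see some of these components from the software side, Python’s standard library can report on a few of them. (This is just an illustrative sketch; the standard library doesn’t expose RAM details portably, so memory is omitted here, and your numbers will differ from anyone else’s.)

```python
import os
import shutil
import socket

# CPU: how many logical processors the operating system reports
print("CPU cores:", os.cpu_count())

# Storage: total and free space on the root drive, in gigabytes
usage = shutil.disk_usage("/")
print("Disk total (GB):", usage.total // 10**9)
print("Disk free  (GB):", usage.free // 10**9)

# Network: the machine's name as seen on the network
print("Hostname:", socket.gethostname())
```

Running this on your own machine is a quick way to connect the vocabulary above to the box sitting in front of you.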

Interaction with a computing device involves input and output. Several devices can be used to control that interaction.

Examples of input devices:
Keyboard, mouse, thumb drive, microphone, scanner

Examples of output devices
Monitor, printer, speakers

You will need to understand that computing devices accept connections from different types of devices, based upon official, approved standards, through different types of interfaces, or connections. For example, internet connections commonly use Ethernet, and thumb drives are connected via Universal Serial Bus, or USB.

You need to understand the interactions between hardware (CPU, RAM, storage, and network) and software, the things that can only be accessed when the computing device is turned on. Especially important is the Basic Input/Output System, or BIOS, which runs when a computing device is powered on or restarted. The BIOS initializes, or prepares, the computing device for the operating system. The operating system, or OS, manages the computer's hardware and software at the same time. When most people use a computer, the operating system is where the majority of input and output interactions take place. The OS presents software, called programs or applications, to users. That software is executed and controlled by the user, the computer's hardware, and input and output devices.

What to Study
The material on the test is constantly being updated. Therefore, you need to fill out a form on the CompTIA website that will give you access to the latest exam objectives. Click here.

Study material & guides
I STRONGLY recommend you study 3 to 6 months before testing. Again, we are talking about $400 here! I also recommend that you get hold of an old computer or two, and practice taking them apart and putting them back together. Add memory, reinstall the operating system, change the BIOS and network settings, etc.

Use the following free training websites as guides:

Begin here:

Then use one of the following:


Or


I will go over some of the more difficult subjects in future posts.

Topics such as…
…virtualization…

…networking…
  

…and “cloud” computing…

I will also offer some “tried and true” tips for gaining employment.

Enough reading.

Here is a “To do” list:

1. Create study progress goals.
a. Ex. 30 minutes a day, Monday through Friday

2. PASS THE TEST!!!!

3. Get a job and excel on the job!

4. Create a career progression plan.
a.  Ex. Help Desk, Systems Administrator, Team Lead, Architect

5. Explore entrepreneurial opportunities.
a.  Look for ways to sell your services independently
b.  Create a team with like-minded individuals and start a corporation



“Man also produces SYMBOLS unconsciously and spontaneously, in the form of DREAMS.”
-Jung


Follow Me on Twitter: @TechAndDaBros