Monday, November 27, 2017

The HOV Lane, Virtualization 101



When it comes to computing, think of virtualization like the high occupancy vehicle (HOV) lane of a major highway.





Virtualization is essentially the shared usage of computer resources among a large group of users. Like carpooling, in which commuters share in the cost of fuel, the goal of virtualization is to increase the efficiency of both the end users and the expensive computer resources they share.

“Where you goin’ fool!? Me too, hop in!!!”


Traditional servers (computers that literally serve files, data, and applications to other computers) consisted of a single, physical combination of processing, storage, and networking resources.


They ran one operating system, and most operated at just 5 to 15 percent of their total load capacity, which meant the bulk of that expensive hardware sat idle.
Virtualization enabled multiple operating systems to run on a single server, putting that idle capacity to work.



Modern server resources tend to be very robust (think dozens of computer chips, trillions of bytes of memory and storage, and high-speed network connections). All of these resources are aggregated, pooled, and allocated to virtual machines, which are software versions of computers.
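
To picture how that pooling works, here is a minimal sketch in Python. The host specs and VM sizes are numbers I made up purely for illustration; a real hypervisor does far more (scheduling, overcommitment, live migration, and so on).

```python
# Minimal model of a host's resource pool being carved into virtual machines.
# All specs below are invented for illustration only.

host_pool = {"cpu_cores": 32, "ram_gb": 512, "storage_tb": 20}

# Each virtual machine is just a slice of the pool, defined in software.
vm_requests = [
    {"name": "web-server",  "cpu_cores": 4, "ram_gb": 16, "storage_tb": 1},
    {"name": "database",    "cpu_cores": 8, "ram_gb": 64, "storage_tb": 4},
    {"name": "file-server", "cpu_cores": 2, "ram_gb": 8,  "storage_tb": 8},
]

for vm in vm_requests:
    # Check that the pool can satisfy the request before "powering on" the VM.
    if all(host_pool[k] >= vm[k] for k in ("cpu_cores", "ram_gb", "storage_tb")):
        for k in ("cpu_cores", "ram_gb", "storage_tb"):
            host_pool[k] -= vm[k]
        print(f"Allocated {vm['name']}")
    else:
        print(f"Not enough resources left for {vm['name']}")

print("Remaining in the pool:", host_pool)
```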



End users (me and you, yo mama and yo cousin too) connect to the files, folders, websites, and other programs installed on these virtual machines as if they were installed on physical machines.



The Main Ingredient: the HYPERVISOR (aka the Virtual Machine Monitor)

Virtualization's essential component is known as the hypervisor.
The smart people at IBM, who invented virtualization, called the hypervisor the virtual machine monitor (VMM for short).

The hypervisor (often depicted in diagrams as the virtualization layer) is software installed on a computer or server that allows it to share its resources.



Hypervisors are classified as one of these two types:

Type 1 (bare metal or native): Bare-metal (native) hypervisors are software systems that run directly on the host's hardware to control the hardware and to monitor the guest operating systems. Consequently, the guest operating system runs on a separate level above the hypervisor.

Type 2 (hosted): Hosted hypervisors are designed to run within a traditional operating system. In other words, a hosted hypervisor adds a distinct software layer on top of the host operating system, and the guest operating system becomes a third software level above the hardware.

Source: Oracle
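
If it helps to see that difference as a stack of layers, here is a tiny conceptual sketch in Python. It is not tied to any particular product; the examples in the comments are simply well-known instances of each type.

```python
# Conceptual stacks for the two hypervisor types, from the bottom (hardware) up.
# Type 1 sits directly on the hardware; Type 2 sits on top of a host OS.

type_1_stack = [
    "Physical hardware",
    "Type 1 (bare metal) hypervisor",   # e.g., vSphere/ESXi, Hyper-V, Xen, KVM
    "Guest operating systems (the virtual machines)",
]

type_2_stack = [
    "Physical hardware",
    "Host operating system",            # e.g., Windows or macOS on your laptop
    "Type 2 (hosted) hypervisor",       # e.g., VirtualBox or VMware Workstation
    "Guest operating systems (the virtual machines)",
]

for name, stack in [("Type 1", type_1_stack), ("Type 2", type_2_stack)]:
    print(f"{name} layering, bottom to top:")
    for level, layer in enumerate(stack, start=1):
        print(f"  {level}. {layer}")
```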






The major virtualization products (and vendors) you will encounter today are:

VMware and the vSphere hypervisor

Microsoft, which produces Hyper-V and Azure

Citrix and the Xen hypervisor

Red Hat, whose Enterprise Linux distribution utilizes a hypervisor called KVM

…and Amazon Web Services

I have administered and engineered some combination of these different technology solutions for over 10 years. I am certified on the VMware, Microsoft, and Citrix platforms. I have previously written about the importance of certifications, and particularly starting your career with CompTIA A+. According to the latest A+ exam objectives, here’s what you need to know about virtualization:



The focus appears to be on the Type 2 (hosted) hypervisor, so review the following:

Client-Side Virtualization
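
One practical note for client-side (Type 2) virtualization: hosted hypervisors run far better when the CPU's hardware virtualization extensions (Intel VT-x or AMD-V) are enabled in the BIOS/UEFI. Here is a small sketch that checks for those extensions on Linux by reading /proc/cpuinfo; on Windows, the same information shows up in Task Manager or in the firmware settings. This is just an illustration of the idea, not part of the exam objectives themselves.

```python
# Check whether this CPU advertises hardware virtualization extensions.
# Linux only: the CPU feature flags live in /proc/cpuinfo.
#   vmx = Intel VT-x, svm = AMD-V

def has_virtualization_support(cpuinfo_path="/proc/cpuinfo"):
    with open(cpuinfo_path) as f:
        flags = f.read()
    return "vmx" in flags or "svm" in flags

if __name__ == "__main__":
    if has_virtualization_support():
        print("This CPU supports hardware virtualization (VT-x or AMD-V).")
    else:
        print("No VT-x/AMD-V flag found; check the BIOS/UEFI settings.")
```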



As you build knowledge and progress in your career, other virtualization technologies will become more relevant (and prevalent). I encourage you to get familiar with them. They include:







Now that you understand a little bit about virtualization, let’s ride.

We in the HOV, I mean virtualization lane like…


two dope boyz aggregated in a Cadillac.


Follow Me on Twitter: @TechAndDaBros

Thursday, November 9, 2017

Happy Birthday Benjamin Banneker!



On this day in 1731, Benjamin Banneker was born free near Baltimore, Maryland. During his life, he became a renowned farmer, mathematician, and astronomer. As a teenager, he invented an irrigation system for his family’s farm, and at the age of 21, he constructed the first clock completely built in North America. The clock was made entirely of wood, and kept accurate time for 50 years.

Banneker’s most noted accomplishment is that of architect of Washington, D.C. Historians tell us that after a year of work, Pierre L'Enfant, the French architect hired by George Washington to design the capital, quit and took all of the plans with him. Banneker, who was placed on the development committee at Thomas Jefferson's request, saved the project by reproducing from memory a complete layout of the streets, parks, and major buildings.

Washington, D.C. itself is a monument to the genius of this great man.

Benjamin Banneker died in 1806 at the age of 75.

Follow Me on Twitter: @TechAndDaBros

Monday, November 6, 2017

Who invented the PC? A brother named Mark Dean.

Dr. Mark E. Dean
 IBM/AP Images 

The personal computer (or PC) is probably the most important invention worldwide in the past 40 years. The first machines that we recognize as computers were developed in the 1940s, evolved through the 1970s, and were called mainframes. The PC, designed for use by one person to do things like create spreadsheets at work or play games at home, was introduced in the 1980s.

In 1981, Mark Dean led a team of engineers at an IBM research facility in Boca Raton, Florida, in giving the world a powerful, reliable, and, most importantly, affordable computer system. Dean himself was responsible for developing the graphical interface, which led to color monitors. He also co-invented the ISA system bus, which allowed PCs to connect to interchangeable peripheral devices like printers and modems made by different vendors. By the end of the 1980s, PC sales would reach 100 million units per year.

Mark Dean and the IBM 5150 PC

But Dean didn’t stop there. In 1999, he led another group of IBM engineers in creating the first computer chip that could perform 1 billion calculations per second. The gigahertz CPU (Central Processing Unit) is standard today in almost any computing device.
 
Today, Mark Dean is a Distinguished Professor at the University of Tennessee’s College of Engineering. He is a member of the National Inventors Hall of Fame, he was the first Black IBM Fellow, and he holds three of the nine original patents related to the invention and development of the personal computer.

Earlier this year, NetworkWorld asked: Who should be on the Tech Mount Rushmore?
 
Here’s my opinion:




From left to right: Bill Gates, Mark Dean, Steve Jobs, and Philip Emeagwali (come back later for his story).








Follow Me on Twitter: @TechAndDaBros

Thursday, November 2, 2017

The A+ Certification & the Importance of Symbols

“…a word or an image is symbolic when it implies something more than its obvious and immediate meaning.” –Carl Gustav Jung, Man and His Symbols

This image changed my life:


It was created by the Computing Technology Industry Association (CompTIA).

CompTIA is the leading provider of vendor-neutral IT certifications in the world.

On May 4, 2008, that image, along with the CompTIA A+ certification that it symbolizes, announced to the world that I was in possession of the knowledge and skills necessary to fix and configure the most common computer technologies in use at that time.

It symbolized my competency as a Technologist.

I was fortunate to already be employed as a software support representative, but my career took off when I decided to get certified.

For me, the A+ certification led not only to more money, but to more responsibility, which led to more skill development, which led to more opportunities, which led to more money, and on and on and on, and this virtuous cycle continues today.

A+ will not guarantee you a job, but I’m recommending it, because it worked for me.

I didn’t study computer science in college.

I only had the desire to do better than I was doing.

So, for anyone interested in starting a career in IT support, the A+ validates computer support skills. You will learn how to fix, enhance, and eventually design computer systems. 

*Note: computer support is distinct from computer programming, which will be covered in future posts.

Search the internet, and you’ll find plenty of favor, and a good amount of opposition, regarding the A+ certification (and IT certifications in general).

Again, this is my advice, based on my experience. The A+ worked for me, and here is more evidence that it can work for you:






Honorable Mentions Based on Popularity

(A table ranking four certifications by popularity, with average salaries of $79,877, $81,601, $83,945, and $89,147.)

The A+ Certification requires two tests, and each costs a little over $200. That amount may be hard to come by. Trust me, I know. It was going to take me 3 to 6 months to save for the exams, but I was blessed to be able to borrow the money from one of my cousins. If you don’t have benevolent kin, don’t worry: use the 3 to 6 months I believe it will take you to prepare for the tests to save up.

Let’s assume that you currently make minimum wage, $7.25/hr. Let’s also assume that you work 40 hours a week. After taxes, you should bring home between $900 and $1,000 a month. If you discipline yourself to save $100 per month, it will take you only about 4 months to save enough to cover the exam fees.
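
If you want to check that math, or plug in your own wage and savings rate, here is a quick sketch. The $200-per-exam figure is an assumption based on the "a little over $200" price mentioned above.

```python
# Rough savings math for the two A+ exams. All figures are approximations;
# the exam price is assumed to be about $200 each, per the text above.
import math

hourly_wage = 7.25          # federal minimum wage
hours_per_week = 40
weeks_per_month = 52 / 12   # about 4.33

gross_monthly = hourly_wage * hours_per_week * weeks_per_month
print(f"Gross monthly pay: ${gross_monthly:,.2f}")   # roughly $1,257 before taxes

exam_cost = 2 * 200         # two exams at roughly $200 each (assumed)
monthly_savings = 100

months_needed = math.ceil(exam_cost / monthly_savings)
print(f"Saving ${monthly_savings}/month covers about ${exam_cost} in {months_needed} months.")
```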

AND IF YOU DON’T THINK YOU CAN DO THAT, READ THIS:


(*Download your own copy here: The Richest Man in Babylon)


How to get started

If you can understand the following concepts, you can pass the A+ tests.

There are four main components of a modern computing device (a short code sketch after this list shows what they look like on a real machine):


1. The Central Processing Unit, or CPU, is the “brains,” or calculator, of the computer. It processes data and executes instructions, or code.

2. Main Memory, also called Random Access Memory or RAM, temporarily stores data, remembers instructions, and helps the CPU execute code faster and more efficiently (more memory = faster computer).

3. Storage, or the Hard Drive, is for permanent data storage. Unlike RAM, the hard drive retains data even when the computing device is turned off.

4. Not pictured, but very important, is the Network Interface. This could be a “card” where an Ethernet cable is connected, or a wireless antenna.
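
Here is that sketch. It reads the four components from a running machine using the third-party psutil library (assuming it has been installed with pip install psutil); on Windows you may need to change the "/" path to your system drive.

```python
# Print the four main components of this computer using the psutil library.
# Assumes: pip install psutil
import psutil

# 1. CPU: how many cores are available to process instructions
print("CPU cores:", psutil.cpu_count(logical=True))

# 2. RAM: temporary working memory, shown in gigabytes
ram = psutil.virtual_memory()
print(f"RAM: {ram.total / 1e9:.1f} GB total, {ram.percent}% in use")

# 3. Storage: permanent storage on the main drive
disk = psutil.disk_usage("/")
print(f"Storage: {disk.total / 1e9:.1f} GB total, {disk.percent}% used")

# 4. Network interfaces: Ethernet cards, Wi-Fi antennas, loopback
for name in psutil.net_if_addrs():
    print("Network interface:", name)
```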

Interaction with a computing device involves input and output. Several devices can be used to control that interaction.

Examples of input devices:
Keyboard, mouse, thumb drive, microphone, scanner

Examples of output devices:
Monitor, printer, speakers

You will need to understand that computing devices accept connections from different types of devices through different types of interfaces, or connections, based on official, approved standards. For example, internet connections commonly use Ethernet, and thumb drives are connected via the Universal Serial Bus, or USB.
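
If you are curious what those connections look like in practice, here is a tiny sketch that lists the USB devices and network interfaces on your own machine. It assumes a Linux system with the standard lsusb and ip command-line tools installed; on Windows, Device Manager shows the same information graphically.

```python
# List the USB devices and network interfaces attached to this machine.
# Linux only; assumes the usual lsusb and ip utilities are present.
import subprocess

print("USB devices connected to this machine:")
subprocess.run(["lsusb"], check=False)               # devices on the USB bus

print("\nNetwork interfaces (Ethernet, Wi-Fi, loopback):")
subprocess.run(["ip", "link", "show"], check=False)  # network interfaces
```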

You need to understand the interactions between hardware (CPU, RAM, storage, and network) and software, the things that can only be accessed when the computing device is turned on. Especially important is the Basic Input/Output System, or BIOS, which runs when a computing device is powered on or restarted. The BIOS initializes, or prepares, the computing device for the operating system.

The operating system, or OS, manages the computer’s hardware and software at the same time. When most people use a computer, the operating system is where the majority of input and output interactions take place. The OS presents software, called programs or applications, to users. That software is executed and controlled by the user, the computer’s hardware, and the input and output devices.
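
To make that power-on sequence concrete, here is a toy sketch of the order of events described above. It is only a model of the flow, not real firmware or operating system code.

```python
# Toy model of the power-on sequence: BIOS first, then the OS,
# then the programs the user actually interacts with. Purely illustrative.

def bios_power_on_self_test():
    # The BIOS checks and initializes the hardware
    hardware = ["CPU", "RAM", "Storage", "Network Interface"]
    for component in hardware:
        print(f"BIOS: initializing {component}")
    return hardware

def load_operating_system(hardware):
    # The BIOS hands control to the OS, which manages hardware and software together
    print("BIOS: handing off to the operating system")
    print(f"OS: now managing {', '.join(hardware)}")

def run_applications():
    # The OS presents programs/applications, controlled by the user and I/O devices
    for app in ["Web Browser", "Word Processor", "Spreadsheet"]:
        print(f"OS: launching {app} for the user")

if __name__ == "__main__":
    hw = bios_power_on_self_test()
    load_operating_system(hw)
    run_applications()
```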

What to Study
The material on the test is constantly being updated. Therefore, you need to fill out a form on the CompTIA website that will give you access to the latest exam objectives. Click here.

Study material & guides
I STRONGLY recommend you study for 3 to 6 months before testing. Again, we are talking about $400 here! I also recommend that you get hold of an old computer or two and practice taking them apart and putting them back together. Add memory, reinstall the operating system, change the BIOS and network settings, etc.

Use the following free training websites as guides:

Begin here:

Then use one of the following:


Or


I will go over some of the more difficult subjects in future posts.

Topics such as…
…virtualization…

…networking…
  

…and “cloud” computing…

I will also offer some “tried and true” tips for gaining employment.

Enough reading.

Here is a “To do” list:

1. Create study progress goals.
a. Ex. 30 minutes a day, Monday through Friday

2. PASS THE TEST!!!!

3. Get a job and excel on the job!

4. Create a career progression plan.
a.  Ex. Help Desk, Systems Administrator, Team Lead, Architect

5. Explore entrepreneurial opportunities.
a.  Look for ways to sell your services independently
b.  Create a team with like-minded individuals and start a corporation



“Man also produces SYMBOLS unconsciously and spontaneously, in the form of DREAMS.”
-Jung


Follow Me on Twitter: @TechAndDaBros