Monday, January 29, 2018

DO YOU REMEMBER THE TURTLE?


I was lucky enough to be chosen to participate in the “Gifted” program at Fowler Drive Elementary (Athens, GA) when I was in the 1st grade. The mid-80s are a blur now, but I remember one exciting morning vividly.

As was customary, a small group of students and I were pulled from each of the 1st grade classrooms located in the Earth Shelter a couple of hours into the school day. We were assembled in the hallway and instructed by our teacher (whose name I cannot remember) to take our places in front of one of four or five computers. They were Apple IIs:


We were challenged to make the “Turtle” (a triangle-shaped cursor) move around the screen. We learned to make him advance by typing the word FORWARD followed by a number. For example:

FORWARD 50

This caused the “Turtle” to “walk” forward 50 steps (centimeters I think). I found it fascinating that he left a trail behind courtesy of the pen tied to his tail!

The commands FORWARD and BACK changed the “Turtle’s” position on the screen, while LEFT and RIGHT changed the direction he faced by a given number of degrees. For example:

LEFT 90

This tells the “Turtle” to rotate 90 degrees to his left.

Throughout that morning, we learned to combine commands and program the “Turtle” to create shapes like squares, circles, and even flowers…

…similar to these images (courtesy of AnimaliaLife and SydLexia)
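If you want to relive that morning today, Python ships with a turtle module that is a direct descendant of LOGO. Here is a minimal sketch (my own modern translation, not the original LOGO we typed) that draws a square exactly the way we did, four walks and four turns:

import turtle

t = turtle.Turtle()      # the triangle-shaped cursor

# A square is just "FORWARD 50, RIGHT 90" repeated four times
for _ in range(4):
    t.forward(50)        # LOGO: FORWARD 50
    t.right(90)          # LOGO: RIGHT 90

turtle.done()            # keep the drawing window open

In LOGO itself, the same square was a one-liner: REPEAT 4 [FORWARD 50 RIGHT 90].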




The “Turtle” was the graphical representation of a programming language called LOGO. Created in 1967 at the research firm Bolt, Beranek and Newman, and developed further at MIT, LOGO was the brainchild of South African-born mathematician Seymour Papert and computer scientist and artificial intelligence researcher Cynthia Solomon. The duo imagined a computer in every classroom, for every child, at a time when almost no one entertained such ideas for logistical and financial reasons. Papert and Solomon persisted, and LOGO developed into a valuable tool for teaching programming to children and promoting computational thinking in early childhood education curricula.

I strongly encourage you to explore the following related links:

20 things to do with a Computer

Seymour Papert

Cynthia Solomon’s LOGO

A “Wired” article about LOGO

LOGO history, via MIT

Syd Lexia’s LOGO

Math Education Sites



LOGO in action:



Follow Me on Twitter: @TechAndDaBros

Monday, January 22, 2018

WHO INVENTED THE INTERNET? A BROTHER NAMED PHIL.



"The Internet as we know it today did not cross my mind. I was hypothesizing a planetary-sized supercomputer and, broadly speaking, my focus was on how the present creates the future and how our image of the future inspires the present." –Dr. Philip Emeagwali

Precocious. Intelligent. Gifted. Genius. Those are some of my favorite words. And when it comes to technology, and the people behind great advancements in technology, you often hear those words used to describe them. Philip Emeagwali was a precocious child. He is an intelligent, gifted genius.

Born in 1954 in Akure, Nigeria, Emeagwali came of age during the Nigerian Civil War. He spent two years in a refugee camp along with his family until, at the age of 14, he joined in the fight for Biafran independence. Even as a child soldier, Emeagwali’s aptitude for numbers was recognizable. At the age of 15, he earned the nickname “Calculus” after publishing a book on “infinitesimal calculus” and constructing a “calculo analog computer”.

Despite war’s devastation, and despite having to drop out of school because his father could not pay for his education, young Philip continued to study on his own. He eventually completed a general education certificate from the University of London, was awarded a four-year scholarship to Oregon State University, and then earned two master’s degrees and a doctorate in scientific computing from the University of Michigan.

He conceived and invented the internet while at Michigan.

As a native of oil-rich Nigeria, Emeagwali understood how oil was drilled. That knowledge and experience were invaluable in the debates on the Ann Arbor campus regarding how to use computers to simulate the detection of oil reservoirs. The methodology was the subject of his dissertation. Based on an idea about predicting the weather he remembered from a science fiction story, and inspired by the construction of honeycombs, Emeagwali concluded it would be better to harness many thousands of the kind of microprocessors found in smaller computers to perform computations, instead of a single expensive supercomputer. He sought and gained permission to use the “Connection Machine” at Los Alamos National Laboratory. This machine was designed to run over 65,000 interconnected microprocessors, and it was sitting unused because no one had figured out how to program it properly. That is, until a brother named Phil showed up.

In 1989, Emeagwali connected remotely to the Connection Machine and programmed it to use approximately 65,000 globally dispersed microprocessors to accurately calculate the amount of oil in a simulated reservoir. Emeagwali’s Hyperball International Network was the original World Wide Web.

No man (or woman) is an island; therefore, I sincerely recognize the great works of Sir Tim Berners-Lee, Robert Kahn, and Vinton Cerf.

BUT WHEN IT COMES TO THE 28-YEAR-OLD INTERNET, PHILIP EMEAGWALI….



Follow Me on Twitter: @TechAndDaBros

Thursday, December 21, 2017

Dear Black people, Please learn how to swim (and code)!



Dear Brothers and Sisters,

This is a letter of caution and encouragement.

On August 2nd, 2010, 15-year-old DeKendrix Warner was enjoying the cooling waters of the Red River outside of Shreveport, Louisiana. His family and some friends had gathered for a late-summer barbecue. Several of the group’s teenagers were splashing and playing along the shallow bank when suddenly DeKendrix slipped into a 25-foot-deep drop-off. He didn’t know how to swim. He yelled for help. Three of his cousins and three of his friends jumped into action to save him. None of them knew how to swim. Not 18-year-old Litrelle Stewart, not 15-year-old LaTevin Stewart, not 17-year-old LeDarius Stewart, not 13-year-old Takeitha Warner, not 14-year-old JaMarcus Warner, and not 17-year-old JaTavious Warner; they all drowned. None of their parents knew how to swim either; they watched helplessly as their children sank one by one beneath the river’s surface, thrashing and screaming for help.

A man named Christopher Patlan heard the screams and saw what was happening. He was close enough to the scene that he was able to dive into the water and save one child: DeKendrix.

The six children that perished, the parents who watched powerlessly as they died, and DeKendrix are Black.

Mr. Patlan is White and Hispanic.

According to the Centers for Disease Control, Black children ages 5 to 19 drown in swimming pools at 5.5 times the rate of White children. Beneath the devastation that resulted from the ostensibly preventable loss of young, precious lives is a legacy of bondage, discrimination, and segregation.

And the history and legacy of disparity in water safety knowledge and experience is eerily parallel to that of computer science.

In Stuck in the Shallow End: Education, Race, and Computing, researcher Jane Margolis and her co-authors explain how the relationship among grade school structures, course offerings, student-to-counselor ratios, and the belief systems held by students, parents, and teachers has led to Black and Latino/a students receiving a disproportionately low number of undergraduate and advanced degrees in computer science (around 7 percent, according to American Community Survey data). Margolis connects today’s inequalities in access to rigorous computer science education to a past in which Black slaves were forbidden to learn how to swim for fear they would escape, and Jim Crow segregation prevented Black people from using the public pools where they would have learned and refined swimming and water safety skills.

Perhaps DeKendrix’s cousins and friends would still be with us if that past had never been.

In closing, I encourage you to read Margolis’s work, as it beautifully lays out the case for rethinking and redesigning the way Black children are prepared to take their places in the rapidly and incessantly evolving global economy. Not all Black children need to major in computer science or become programmers. But the problem-solving skills characteristic of a proper computer science education can go a long way toward helping young Brothers and Sisters move people of color toward first economic, then social, and thus political parity.

Sincerely,

Technology and the Brothers


Cullen Jones, Swimmer, Gold-Medalist, World Record Holder, Brother

Follow Me on Twitter: @TechAndDaBros

Thursday, December 7, 2017

I dated Bitcoin, but I married Blockchain




Bitcoin is a movement by itself, but it’s a force when combined with Blockchain!

Shout out to Ne-Yo!

By now, you’ve probably heard of Bitcoin. Whether you’ve been the unlucky victim of a ransomware attack and been directed to pay up in Bitcoin, or you’ve been looking to further diversify your investment portfolio, you’ve probably read or heard about the first “cryptocurrency,” which is all the rage these days. As of this writing, the cost of one Bitcoin is approaching $17,000 USD. I’m no finance expert, but I’ve recently taken the plunge and invested, or should I say traded, a few bucks for a few fractions of a Bitcoin. Hopefully, I’ll be able to retire off of my investment/trade in another 12 to 18 months! But if not, I’m still excited about Bitcoin, and especially about the technology that undergirds it, called BLOCKCHAIN.

I’m not going to go into the nearly 10-year history of Bitcoin and Blockchain; there are plenty of web resources out there to fill you in:




I only want to convey my high interest in, and enthusiasm for, these platforms. Blockchain in particular has the potential to revolutionize human affairs in much the way the Internet did.

In particular, a Blockchain has been defined as a:

"...ledger of facts, replicated across several computers assembled in a peer-to-peer network. Facts can be anything from monetary transactions to content signature. Members of the network are anonymous individuals called nodes. All communication inside the network takes advantage of cryptography to securely identify the sender and the receiver. When a node wants to add a fact to the ledger, a consensus forms in the network to determine where this fact should appear in the ledger; this consensus is called a block."

A decentralized, secure, and immutable electronic ledger can be used to verify information needed to complete transactions all over the world, for much less than current rates charged by banks and other financial intermediaries, and in much less time (minutes compared to hours, days, or even weeks). But the Blockchain is not just a financial technology tool. It is currently being tested to develop driverless vehicles, store government data, and host educational competitions.
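To make the “chain” part of that definition concrete, here is a toy sketch in Python. It is purely illustrative (it skips the peer-to-peer network and the consensus step the definition describes): each block stores the cryptographic hash of the block before it, so altering any old fact breaks every hash that follows.

import hashlib
import json
import time

def make_block(fact, previous_hash):
    """Bundle a fact with a timestamp and the hash of the previous block."""
    block = {
        "fact": fact,
        "timestamp": time.time(),
        "previous_hash": previous_hash,
    }
    # The block's own hash covers everything above, chaining it to its parent
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

# Build a tiny ledger: each block points back at the one before it
chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("Alice pays Bob 5", chain[-1]["hash"]))
chain.append(make_block("Bob pays Carol 2", chain[-1]["hash"]))

# Tampering with an old fact is detectable: the stored hash no longer matches
chain[1]["fact"] = "Alice pays Bob 500"
fields = {k: chain[1][k] for k in ("fact", "timestamp", "previous_hash")}
recomputed = hashlib.sha256(json.dumps(fields, sort_keys=True).encode()).hexdigest()
print("Ledger intact?", recomputed == chain[1]["hash"])  # prints: Ledger intact? False

A real Blockchain adds the consensus step from the definition above (proof-of-work, in Bitcoin’s case) so the anonymous nodes can agree on which block gets appended next.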

Now, RIGHT NOW, is the time to get familiar with the way this technology works, and to acquire the skills (e.g., JavaScript, cryptography, linear algebra) needed to develop applications that can leverage Blockchain.

Free (for now) Learning Resources and Information:





So what’s in this for the Brothers (and Sisters)?

I Googled “Blockchain and Black people” and came across two very enlightening and thought-provoking articles:

Two years ago, in The Atlantic, Kyle Coward asked, “Why Are So Few Black People Using Bitcoin?”

Ed Dunn, via Medium, seeks to clarify the definition of “decentralization” as it relates to Blockchain, and states that Brothers and Sisters are Blockchain pioneers.

Given the developing hysteria surrounding Bitcoin, and the limitless potential applications of Blockchain technology, there is only one appropriate action to take.

Like Spike Lee said...


***P.S.***

This YouTube video provides a good explanation of Blockchain that anyone can grasp:





Follow Me on Twitter: @TechAndDaBros

Monday, November 27, 2017

The HOV Lane, Virtualization 101



When it comes to computing, think of virtualization like the high occupancy vehicle (HOV) lane of a major highway.





Virtualization is essentially the shared usage of computer resources among a large group of users. Just as carpooling lets commuters share the cost of fuel, virtualization aims to increase the efficiency of both the end users and the expensive computer resources they share.

“Where you goin’ fool!? Me too, hop in!!!”


Traditional servers (computers that literally serve files, data, and applications to other computers) consisted of a single, physical combination of processing, storage, and networking resources.


They ran one operating system each, and most operated at just 5 to 15 percent of their total load capacity. Virtualization enabled multiple operating systems to run on a single server.



Modern server resources tend to be very robust (think dozens of computer chips, trillions of bytes of memory and storage, and high-speed network connections). All of these resources are aggregated, pooled, and allocated to virtual machines, which are software versions of computers.
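Here is a toy sketch of that pooling idea (purely illustrative, not any vendor’s API): a host aggregates its cores and memory into a pool, and virtual machines are carved out of the pool until it runs dry.

from dataclasses import dataclass, field

@dataclass
class Host:
    """A physical server whose resources are pooled for virtual machines."""
    cpus: int                            # physical cores available
    memory_gb: int                       # physical RAM available
    vms: list = field(default_factory=list)

    def allocate_vm(self, name, cpus, memory_gb):
        # Refuse the request if the pool can't cover it
        if cpus > self.cpus or memory_gb > self.memory_gb:
            raise RuntimeError(f"Not enough resources left for {name}")
        self.cpus -= cpus                # carve the cores out of the pool
        self.memory_gb -= memory_gb      # carve the memory out of the pool
        self.vms.append((name, cpus, memory_gb))

host = Host(cpus=32, memory_gb=256)
host.allocate_vm("file-server", cpus=4, memory_gb=16)
host.allocate_vm("web-server", cpus=8, memory_gb=32)
print(host.vms, "| left in the pool:", host.cpus, "cores,", host.memory_gb, "GB")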



End users (me and you, yo mama and yo cousin too) connect to the files, folders, websites, and other programs installed on these virtual machines as if they were installed on physical machines.



The Main Ingredient: the HYPERVISOR (aka the Virtual Machine Monitor)

Virtualization’s essential component is known as the hypervisor. The smart people at IBM, who invented virtualization, called it the virtual machine monitor (VMM for short).

The hypervisor (depicted below as the virtualization layer) is software installed on a computer or server that allows it to share its resources.



Hypervisors are classified as one of two types:

Type 1 (bare metal or native): Bare metal (native) hypervisors are software systems that run directly on the host's hardware to control the hardware and to monitor the guest operating systems. Consequently, a guest operating system runs on a separate level above the hypervisor.

Type 2 (hosted): Hosted hypervisors are designed to run within a traditional operating system. In other words, a hosted hypervisor adds a distinct software layer on top of the host operating system, and the guest operating system becomes a third software level above the hardware.

Source: Oracle
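If you want to poke at a hypervisor yourself, here is a short sketch using the libvirt Python bindings to list the virtual machines a host is running. It assumes the libvirt-python package and a local QEMU/KVM hypervisor (the kind Red Hat ships, mentioned below) are installed:

import libvirt  # pip install libvirt-python

# Connect to the local QEMU/KVM hypervisor; read-only is enough for listing
conn = libvirt.openReadOnly("qemu:///system")
if conn is None:
    raise SystemExit("Failed to connect to the hypervisor")

# In libvirt terms, each virtual machine is a "domain"
for dom in conn.listAllDomains():
    state = "running" if dom.isActive() else "stopped"
    print(f"{dom.name()}: {state}")

conn.close()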






The major virtualization products (and vendors) you will encounter today are:



VMware and the vSphere hypervisor




Microsoft, which produces Hyper-V and Azure





Citrix and the Xen hypervisor







Red Hat, whose Enterprise Linux distribution utilizes a hypervisor called KVM.



…and Amazon Web Services







I have administered and engineered some combination of these different technology solutions for over 10 years. I am certified on the VMware, Microsoft, and Citrix platforms. I have previously written about the importance of certifications, and particularly starting your career with CompTIA A+. According to the latest A+ exam objectives, here’s what you need to know about virtualization:



The focus appears to be on the Type 2 (hosted) hypervisor, so review the following:

Client-Side Virtualization



As you build knowledge and progress in your career, other virtualization technologies will become more relevant (and prevalent). I encourage you to get familiar with them. They include:



~and~




Now that you understand a little bit about virtualization, let’s ride.

We in the HOV, I mean virtualization lane like…


two dope boyz aggregated in a Cadillac.


Follow Me on Twitter: @TechAndDaBros

Thursday, November 9, 2017

Happy Birthday Benjamin Banneker!



On this day in 1731, Benjamin Banneker was born free near Baltimore, Maryland. During his life, he became a renowned farmer, mathematician, and astronomer. As a teenager, he invented an irrigation system for his family’s farm, and at the age of 21, he constructed the first clock completely built in North America. The clock was made entirely of wood and kept accurate time for 50 years.

Banneker’s most noted accomplishment is that of architect of Washington, D.C. Historians tell us that after a year of work, the French architect hired by George Washington to design the capital quit and took all of the plans with him. Banneker, who was placed on the development committee at Thomas Jefferson’s request, saved the project by reproducing from memory a complete layout of the streets, parks, and major buildings.

Washington, D.C. itself is a monument to the genius of this great man.

Benjamin Banneker died in 1806 at the age of 75.

Follow Me on Twitter: @TechAndDaBros

Monday, November 6, 2017

Who invented the PC? A brother named Mark Dean.

Dr. Mark E. Dean
 IBM/AP Images 

The personal computer (or PC) is probably the most important invention worldwide of the past 40 years. The first machines that we recognize as computers were developed in the 1940s, evolved through the 1970s, and were called mainframes. The PC, designed for use by one person to do things like create spreadsheets at work or play games at home, was introduced in the 1980s.

In 1981, Mark Dean led a team of engineers at an IBM research facility in Boca Raton, Florida, in giving the world a powerful, reliable, and, most importantly, affordable computer system. Dean himself was responsible for developing the graphics technology that led to color monitors. He also co-invented the ISA system bus, which allowed PCs to connect to interchangeable peripheral devices, like printers and modems, made by different vendors. PC sales would eventually reach 100 million units per year.

Mark Dean and the IBM 5150 PC

But Dean didn’t stop there. In 1999, he led another group of IBM engineers in creating the first computer chip that could perform 1 billion calculations per second. The gigahertz CPU (Central Processing Unit) is now all but standard in any computing device.
 
Today, Mark Dean is a Distinguished Professor at the University of Tennessee’s College of Engineering. He is a member of the National Inventors Hall of Fame, he was the first Black IBM Fellow, and he holds three of the nine original patents related to the invention and development of the personal computer.

Earlier this year, NetworkWorld asked: Who should be on the Tech Mount Rushmore?
 
Here’s my opinion:




From left to right: Bill Gates, Mark Dean, Steve Jobs, and Philip Emeagwali (come back later for his story).








Follow Me on Twitter: @TechAndDaBros