Friday, May 3, 2024

Community College Engineering Student Transfer

Yesterday I checked in via LinkedIn with a Holyoke Community College Engineering program graduate who transferred to a nationally ranked top-ten engineering university. The student is studying Electrical Engineering there, and I asked how things were going. Here’s a screenshot of the response I got with identifying information removed – including the student’s name and the transfer university. Pretty cool!

The student compliments my two classes (Circuits 1 and Circuits 2), but there is so much more. Both classes are Calculus and Differential Equations based, so the students need to really know their math before I get them. The math, physics and chemistry instruction is exceptional at Holyoke Community College – as it is at so many other community colleges in the country. It is not just the STEM classes that prepare students for my classes, though. To get their degree, our students need to take additional courses including English Composition, History, Social Sciences and, in some cases, Business courses. These courses are critical, complementing the technical knowledge, skills and abilities gained in engineering courses and producing well-rounded professionals capable of addressing complex challenges with creativity, empathy, and ethical awareness.

 

I see it every day, with students coming to my classes prepared to learn, solve problems, communicate and understand some pretty complex stuff. Amazing faculty doing amazing things in their classrooms make it pretty easy for me to teach those classes.


We (community colleges) often face unjust criticism due to misconceptions. Despite offering quality education, we’re sometimes seen as inferior to four-year institutions. We provide valuable opportunities and options with smaller classes, dedicated faculty, and affordable tuition. And let's not forget transfer to four-year institutions.

 

Thanks to the unnamed student – you certainly made my day!

Monday, April 29, 2024

Distributed Inference And Tesla With Some SETI Nostalgia


In this post, I’m setting aside any political stuff and focusing solely on tech.

 

In recent months, the electric vehicle (EV) market has seen a decline, marked by falling sales and an increase in unsold inventory. Tesla, in particular, has received a significant share of negative attention. During Tesla's first-quarter earnings call last week, Elon Musk diverged from the norm by highlighting Tesla's broader identity beyond its role in the automotive industry. He emphasized the company's engagement in artificial intelligence and robotics, suggesting that pigeonholing Tesla solely within the EV sector overlooks its broader potential.
 

Musk's suggestion to actively utilize Tesla's computational power hints at a larger strategic vision. He envisions a future where idle Tesla vehicles contribute to a distributed network for AI model processing, termed distributed inference. This concept could leverage the collective computational strength of millions of Tesla cars worldwide, extending the company's impact beyond transportation.

 

Very interesting – I drive maybe 1-2 hours per day; the rest of the time my car is not being used. What if all that computing horsepower could be used while I’m not using it? Musk’s concept brings up memories of the sunsetted SETI@home application. SETI@home was a distributed computing project that allowed volunteers to contribute their idle computer processing power to analyze radio signals from space in the search for extraterrestrial intelligence (SETI). SETI@home used data collected by the Arecibo Observatory in Puerto Rico and the Green Bank Telescope in West Virginia to search for patterns or anomalies that could indicate the presence of intelligent alien civilizations.

 

Participants in SETI@home downloaded a screensaver or software client onto their computers, which would then process small segments of radio telescope data during periods of inactivity. The processed data would be sent back to the project's servers for analysis. By harnessing the collective power of millions of volunteer computers around the world, SETI@home was able to perform computations on an unprecedented scale. The project was launched in 1999 by the University of California, Berkeley, and it quickly became one of the largest distributed computing projects in history. Although the original SETI@home project ended in 2020, its legacy lives on as an example of the power of distributed computing and the widespread public interest in the search for extraterrestrial life.
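To make the work-unit style of distributed computing concrete, here is a minimal Python sketch of the loop a volunteer-computing client follows: fetch a small chunk of data, crunch it locally while the machine is idle, and report a compact result back. Everything here is simulated in-process, and the signal-analysis step is a toy stand-in of my own – it is not the actual SETI@home client, protocol, or data format.

```python
import numpy as np
from queue import Queue

def make_work_units(n_units, samples_per_unit=4096, rng=None):
    """Simulate a server's backlog: each work unit is a chunk of noisy samples."""
    rng = rng or np.random.default_rng(42)
    q = Queue()
    for uid in range(n_units):
        samples = rng.normal(size=samples_per_unit)
        # Hide a weak tone in some units so there is something to "find".
        if uid % 3 == 0:
            t = np.arange(samples_per_unit)
            samples += 0.2 * np.sin(2 * np.pi * (200.0 / samples_per_unit) * t)
        q.put((uid, samples))
    return q

def process_unit(samples):
    """Toy analysis: look for a narrowband peak in the power spectrum."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    peak_ratio = spectrum.max() / spectrum.mean()   # peak-to-average power ratio
    return {"peak_ratio": float(peak_ratio), "candidate": bool(peak_ratio > 30)}

def run_client(server_queue):
    """The client loop: fetch a work unit, process it, report the result, repeat."""
    results = []
    while not server_queue.empty():
        uid, samples = server_queue.get()             # "download" a work unit
        results.append((uid, process_unit(samples)))  # crunch it during idle time
    return results                                    # "upload" the results

if __name__ == "__main__":
    for uid, report in run_client(make_work_units(6)):
        print(uid, report)
```

Musk’s distributed-inference idea has the same shape – the “work units” become pieces of an AI inference workload, and the “volunteer computers” become parked Teslas.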

 

Musk's vision underscores Tesla's potential to revolutionize not only the automotive sector but also broader domains such as artificial intelligence and robotics. It signifies a strategic shift towards leveraging Tesla's resources and expertise in a SETI-like way to drive innovation and create value in new and unexpected ways.

Friday, April 26, 2024

Communications, Networking Methods & Protocols: Introduction and the Information Asset

Terry Pardoe and I wrote an unpublished text titled Data Communications, Networking Methods and Protocols 20 years ago. Terry passed away on May 2, 2016, at the age of 76. Over this summer I’ll be posting content from that unpublished book here in honor and respect of Terry. It is interesting, 20 years later, to see a combination of some now-obsolete and some still-relevant technologies. Here’s the first post, from the first chapter.

 

The creation and introduction of the binary digital computer into the world of information collection, processing and distribution has brought with it massive expansions in the speed of processing and the breadth of distribution. It has also brought new approaches to connection and an ever-increasing need to construct and operate complex, multi-vendor networks. Computer systems allow us to perform complex information manipulations millions of times faster than by hand and reduce the risk of making the same mistakes we always made.

 

Before any attempt is made to analyze the creation and operation of networks, ranging in size from ones covering a single household to global coverage, we need to understand the evolving role of computers in the past, the present and the future, and how our need to deliver computer power and information to a wide range of users has resulted in complex solutions utilizing a broad spectrum of computer types and transmission mechanisms. Such integration has made the use of standardized approaches of paramount importance.

 

In this post we'll take a look at how computer systems, and information use, have evolved into modern approaches and how the world of standards has ensured this transition from the simple to the complex.

 

The Information Asset


The collection, storage and maintenance of timely information over a wide range of types has been implemented over the centuries by a range of written book-keeping techniques that include wall paintings, scrolls, and both handwritten and typed ledgers.

 

Within a corporation, different types of information exist in many forms. Corporate-level information can include financial records, asset lists, customer profiles, product definitions and specifications, trend analyses, competition evaluations and much more. At the department level, information can include function definitions, resource availability, staffing lists, technical specifications, schedules and other operational information. In addition, information such as personal schedules, travel support documents, operating procedures, usernames and passwords is typically collected and saved by individuals.

 

A corporation may also acquire and maintain personal, and often private and sensitive, information about its employees, including social security and tax information, educational background materials and work history. It may also save information considered to be useful to the corporation from public sources. Trade laws and restrictions in overseas markets, climatic conditions in countries of operation, demographics, maps and travel instructions are all examples of this type of information. Such collection and storage of information has always presented a number of issues to management, the major ones being:


Ownership - Who, within the organization, owns the information and protects and certifies its accuracy.

 

Control - Who controls the information, its collection, its use by whom, its modification, also by whom and when, and its final elimination. (It should be noted that ownership and control may be vested in different individuals or organizational units.)

 

Distribution - How is information distributed, to whom, under what conditions, by what technical mechanisms, and what controls are in place to prevent it from being misused or falling into the wrong hands.

 

The key to successful information control lies in the selection or creation, and subsequent implementation, of a company-wide suite of information standards. Many examples of such standards exist and have evolved over the ages, addressing such issues as:


·      The infrastructure needed to create, maintain and use stored information resources.

·      The financial cost of creation, maintenance, protection and final elimination of all forms of information.

·      All machine (if used) and human factors.

·      Measures taken to eliminate the impact of all disasters, natural or manmade.

 

The goal with all collected and distributed information, whether it be stored as paintings on cave walls or detailed writings in ledgers, has always been to meet the objectives of what the authors have defined as the Information Bill of Rights.

  • The right information
  • To the right person or process
  • At the right time
  • In the right place
  • In the right form and format
  • At the right price

Wednesday, April 24, 2024

Spatial Diversity In Wireless Communications

Spatial diversity is one of those fundamental technologies used in wireless communications
(cellular networks, Wi-Fi, satellite communications, and broadcasting) that does not get much exposure. The technology is used to combat fading and improve signal quality, enabling reliable communication links, especially in challenging environments characterized by obstacles, interference, or long propagation distances. Let’s take an introductory look. 

Spatial diversity exploits the spatial dimension of wireless channels by deploying multiple antennas at either the transmitter or receiver, or both. By leveraging spatial separation between antennas, spatial diversity techniques minimize the effects of fading, which result from signal attenuation, reflections, and scattering in multipath propagation environments. Through the simultaneous reception of multiple independent copies of a transmitted signal, spatial diversity enhances the likelihood of receiving at least one strong signal, thus improving the overall reliability of communication links.

 

There are three key methods involved - Selection Diversity, Maximal Ratio Combining (MRC) and Equal Gain Combining (EGC). A short simulation sketch comparing them follows the descriptions below.

 

Selection Diversity: In selection diversity, multiple antennas are strategically placed to receive the same signal, and the antenna with the highest received signal strength is chosen for further processing. This technique is relatively simple to implement and offers improved diversity gain, particularly in scenarios with moderate to severe fading.

 

Maximal Ratio Combining (MRC): MRC combines signals from multiple antennas using different complex weights determined by the channel conditions. By weighting each received signal according to its signal-to-noise ratio (SNR) and combining the branches coherently, MRC maximizes the output SNR (the combined SNR equals the sum of the per-branch SNRs), thereby enhancing the overall signal quality and reliability.

 

Equal Gain Combining (EGC): EGC employs a simpler approach, co-phasing the signals from the multiple antennas and combining them with equal weights. While less complex than MRC, EGC still provides diversity gain by mitigating the impact of fading through signal averaging.
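To make the comparison concrete, here is a small NumPy simulation sketch – my own illustration rather than anything from a standard – that generates independent Rayleigh-fading branches and compares the output SNR delivered by selection diversity, EGC and MRC against a single antenna. The antenna count, trial count and power values are arbitrary assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_branches = 4          # receive antennas
n_trials = 100_000      # independent fading realizations
noise_power = 1.0       # per-branch noise variance
signal_power = 1.0      # transmitted symbol power

# Independent Rayleigh fading: complex Gaussian channel gains, one per branch.
h = (rng.normal(size=(n_trials, n_branches)) +
     1j * rng.normal(size=(n_trials, n_branches))) / np.sqrt(2)

branch_snr = signal_power * np.abs(h) ** 2 / noise_power   # SNR on each branch

# Selection diversity: use only the branch with the highest instantaneous SNR.
snr_selection = branch_snr.max(axis=1)

# Maximal ratio combining: weight each branch by its conjugate channel gain;
# the combined SNR is the sum of the per-branch SNRs.
snr_mrc = branch_snr.sum(axis=1)

# Equal gain combining: co-phase the branches (unit-magnitude weights) and add;
# combined SNR = (sum of channel magnitudes)^2 * signal power / (N * noise power).
snr_egc = (np.abs(h).sum(axis=1) ** 2) * signal_power / (n_branches * noise_power)

for name, snr in [("single branch", branch_snr[:, 0]),
                  ("selection", snr_selection),
                  ("EGC", snr_egc),
                  ("MRC", snr_mrc)]:
    print(f"{name:14s} mean SNR = {10 * np.log10(snr.mean()):5.2f} dB")
```

With four branches, the average output SNR climbs from roughly 0 dB for a single antenna to about 3 dB with selection, 5 dB with EGC and 6 dB with MRC, which matches the ordering described above.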

 

Spatial diversity offers an effective mechanism to combat fading and enhance signal reliability. Through the strategic deployment of multiple antennas and the application of diverse combining techniques, the technology improves data transmission across a wide range of environments and applications.

Monday, April 15, 2024

Lost Text and Lost Friend: Terry Pardoe and Data Communications, Networking Methods and Protocols

 

In 2003-2004, I collaborated with Terry Pardoe, co-authoring a Network Security book published in 2004. Inspired by its success, in 2005 we began work on another book titled Data Communications, Networking Methods and Protocols, which unfortunately never made it to publication. Fast forward to September 2014, when I had the honor of delivering the opening keynote for the fall semester at New Hampshire Community Technical College (NHCTC) in Nashua. Terry played a pivotal role in making this happen, and during the event we had the chance to capture the photo here together, proudly holding our first co-authored text.

I first crossed paths with Terry back in 1999 when he joined NHCTC-Nashua as a part-time lecturer and became a subject matter expert in the (sunsetted in 2016) Verizon NextStep program. Despite a rocky start, our relationship quickly blossomed into a close friendship. Terry possessed a remarkable sense of humor, though I can't recall ever seeing him laugh. He sure knew how to make me laugh, though. He resided in Nashua, New Hampshire, alongside his wife and family. A few years ago I learned Terry passed away on May 2, 2016, at the age of 76. Years later and I’m just finding out – it happens to us all: we lose touch with the people we work with and are friends with when things change. Before we get into the content, here’s a little bit about Terry.

 

Terry was born in the United Kingdom and educated at the Birmingham College of Advanced Technology (now Aston University). After coming to the States, Terry D. Pardoe was executive vice president of International Management Services Inc., a US-based computer application and training organization, for 23 years (until August 1999). At the time of his death, he had more than 40 years of experience in the design and application of networks, communications and information systems. He was an internationally recognized expert on all aspects of telecommunications and networking, including wide and local area networks, TCP/IP based networks, the Internet, intranets, client-server computing, data and network security and many other applied areas. He lectured and consulted on a worldwide basis for a wide range of clients including Digital Equipment Corp., AT&T, Sprint United, Verizon, Citibank, IBM, Honeywell, NT&T (H.K.), SCI (Brazil), and others. Terry worked with all the major agencies of the US Government including NASA, NSA, DISA, the US Navy, the US Army, the IRS and many others.

 

Terry was the author or co-author of over 200 technical texts on computer applications, management techniques and data communications, including the text supporting the first Java seminar available on a worldwide basis. He also authored many web pages that included complex graphics, Java applets and JavaScript.

 

Today I found an old CD-ROM with all of the text, images, etc. of the Data Communications, Networking Methods and Protocols book intact. Over the summer I’ll be posting content from that unpublished book here in honor and respect of Terry. It is interesting, 20 years later, to see a combination of some now-obsolete and some still-relevant technologies.

 

An amazing man and an amazing career, building the foundation for the Internet we have today. Thanks Terry!!


Saturday, February 24, 2024

New Report: Talent Disrupted - College Graduates, Underemployment, and the Way Forward

The Burning Glass Institute and the Strada Institute for the Future of Work have released a new data-driven research report titled Talent Disrupted: College Graduates, Underemployment, and the Way Forward (2024). The report highlights a concerning trend among bachelor's degree holders in the job market. Only about half secure college-level jobs within a year of graduation, with the rest working in positions that don't require a degree. Many remain underemployed even after ten years, indicating ongoing challenges in career advancement.

A recommended full read for students, families, policymakers, and educators. Here are a few key points from the report:

 

Mismatch of Skills and Job Requirements: The fact that only about half of bachelor's degree holders secure employment in college-level jobs within a year of graduation suggests a mismatch between the skills they've acquired and the skills demanded by employers. This mismatch can contribute to underemployment, where individuals end up working in jobs that don't fully utilize their education and skills.

 

Persistent Underemployment: It's concerning that a significant portion of graduates remain underemployed even a decade after graduation. This suggests that the issue of underemployment is not just a temporary hurdle for recent graduates but a long-term challenge that affects their career trajectories and earning potential.

 

Impact on Career Progression and Earnings: Underemployment can have lasting consequences on individuals' career progression and earnings potential. Working in jobs that don't require a degree or make meaningful use of college-level skills can hinder opportunities for advancement and may result in lower wages compared to those in jobs that align with their education and training.

 

Implications for Higher Education: These findings also raise questions about the effectiveness of higher education in preparing graduates for the workforce. It highlights the importance of ensuring that educational programs align with the evolving needs of the labor market and that graduates possess the skills and competencies required for success in their chosen fields.

Addressing underemployment among college graduates requires ongoing collaboration between educational institutions, employers, policymakers, and other stakeholders. This involves aligning curriculum with industry needs, providing career counseling and work-based learning opportunities, and promoting lifelong learning. By working together, we can better prepare graduates for success in the workforce.

Tuesday, October 31, 2023

Pricing Up a New Maxed Out MacBook Pro Max

I typically buy a new personal laptop computer every 5-6 years. I was a diehard Windows person until Windows Vista came along when I crossed over to the Mac world. I've been a Mac person ever since.

Because I keep my computers for a long time, when I buy I always load up on hardware. I typically go for the fastest processor, the most memory and the largest storage drive. Today I decided to price up a new loaded MacBook Pro with the M3 Max and was blown away by the price - $7,200!


Pondering, I decided to skip the $19 Apple Polishing Cloth. I can always pick one up later.

Granted, maxed out, the configuration is for high-end niche users and not users like me. My 14-inch M1 Pro is only two years old with 16 GB of RAM and a 1 TB SSD, so..... I'll just hang on to that for now.