Tuesday, November 28, 2023

Blog Post #10

"In the Age of AI" Reflection

Artificial intelligence has become a global phenomenon, touching everything from business and privacy to rising geopolitical tensions involving the United States. While watching the documentary "In the Age of AI," I learned how artificial intelligence is changing life as we know it in the grand scheme of things. PBS FRONTLINE investigates the potential benefits and risks associated with AI and automation, framing them as the drivers of the new industrial revolution of the 21st century and possibly the next cyber apocalypse.


The documentary starts out with footage of Go, an ancient strategy game from China that requires a high level of thinking and strategic gameplay. It highlights Google's AI program, AlphaGo, built to play this game against one of the world's best players. Most observers believed the program would fail, but it ultimately won. Trained on an enormous number of games, it played moves that humans had never thought to try in the thousands of years this game has existed.

The results of this game started to raise both concern and excitement about the rise of AI. These systems can do the same tasks as humans, and potentially more. Although deemed innovative, this puts human jobs in danger. In just a few short years, AI may be taking on more inventive and intense tasks such as teaching, assisting with and performing surgeries, and creating business plans.

The PBS documentary provides different perspectives on the rise of artificial intelligence. Many people view it as a tool to help professionals in various fields. For example, as college students, we use ChatGPT for research, editing, and creativity. In healthcare, AI aids in early disease detection and drug discovery. AI also fosters accessibility improvements for those with disabilities.

On the negative side, the rise of AI could mean the rise of dehumanization, blurring ethical boundaries. As machines replace human tasks, there's a risk of diminishing empathy and personal connections. AI could also open new threats to cybersecurity, and privacy is at great risk. This intelligence is complex and multifaceted, and companies or skilled officials can use it to exploit personal information. For example, China's AI surveillance system is so extensive, with millions of cameras following people 24/7, that it has even been used for public shaming.


Overall, the emergence of AI has brought about a new era, marked by advancements and challenges. As AI continues to evolve, balancing its capabilities against the societal challenges it creates is crucial for harmonizing technology and humanity.

Wednesday, November 15, 2023

Blog Post #7

Diffusion Theory of the Smartwatch



The Diffusion of Innovations theory, developed by Everett Rogers, provides insights into the adoption of new technologies like smartwatches. Viewed through this lens, the smartwatch has had an extensive evolution. Its earliest ancestor arrived in 1972, when the Hamilton Watch Company and Electro/Data Inc. released the Pulsar, an LED prototype; wearable-computing pioneer Steve Mann later pushed the idea of a computer worn on the body even further.


Early smartwatch adoption soared among tech enthusiasts and business people. Smartwatches from the 1970s and 1980s provided sports scores, television viewing, radio, stock prices, and weather forecasts. Later adopters came along more slowly, and some people may never find a need for a smartwatch at all.

The initial versions of smartwatches were priced between $1,000 and $3,000, making them less appealing to the general public. However, the current market features more accessible options, such as the Fitbit or Apple Watch, which are available at prices ranging from $100 to $500. This affordability has significantly broadened demand for smartwatches. Additionally, if individuals don't see added value in the unique technological features or don't know how to use them (boomers), they will most likely not adopt the technology.

The 'boom of the smartwatch' really began in the 2010s because of how multifunctional the product had become. Smartwatches provide a diverse set of features, including fitness tracking, voice commands, mobile payments, and seamless smartphone notifications. Alongside these functionalities, smartwatches offer extensive customization options and currently present a more practical alternative to higher-end watches.
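To make those adoption patterns a bit more concrete, here is a minimal sketch of the Bass diffusion model, a standard way to formalize the S-shaped adoption curve that Rogers' theory describes. The coefficients and market size below are illustrative assumptions, not real smartwatch data.

```python
# Minimal sketch of the Bass diffusion model (illustrative parameters, not smartwatch data).
# Each year, new adopters come from innovators (p) acting on their own and from
# imitators (q) influenced by the share of people who have already adopted.

def bass_adoption(p=0.03, q=0.38, market_size=1_000_000, years=15):
    adopters = 0.0
    history = []
    for _ in range(years):
        new_adopters = (p + q * adopters / market_size) * (market_size - adopters)
        adopters += new_adopters
        history.append(round(adopters))
    return history

if __name__ == "__main__":
    for year, total in enumerate(bass_adoption(), start=1):
        print(f"Year {year:2d}: {total:>9,} cumulative adopters")
```

The slow start, rapid middle, and eventual leveling-off this produces mirror the early-adopter, boom, and late-adopter phases described above.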


Smartwatches, while offering convenience, come with significant downsides. Data collection and constant notifications raise privacy concerns and contribute to dependency and distraction, affecting productivity. Prolonged device use may lead to health issues like eye strain and other unhealthy habits. Rapid technological advancements render early models obsolete, causing dissatisfaction. Dependence on technology, frequent charging, and a shorter lifespan compared to traditional watches are additional drawbacks. Excessive device engagement can also lead to social isolation, detracting from meaningful interpersonal interactions.

Overall, the Diffusion of Innovations theory provides valuable insights into the evolution of the smartwatch. It not only explains adoption patterns but also emphasizes the need to navigate the complexities of integrating innovative technologies into our lives.

Tuesday, November 14, 2023

Blog Post #6

Antiwar Voices 

The subject of war is often covered in silence, a topic we tend to hide from and prefer not to think about. Mainstream media tends to focus on narratives of global and national conflicts, with an ideology that permits and promotes war, while the perspective of peace is rarely heard. Independent news outlets such as The American Conservative remain unnoticed in the mainstream news; I had never heard of these websites until today. These sites and countless other independent peace activists dive into the same war-related subjects, but with a shared commitment to promoting peace and seeking resolutions.

I had never seen any source of peace advertised on social media or any major news network. With the current Israel-Hamas war, platforms like TikTok and Instagram tend to sensationalize the tragedy by exposing graphic images and sparking online debates that veer off the point. This could just be a consequence of social media, or of the government's desire to intervene for its own gain. The government possibly does not want people to know about anti-war sites and initiatives, since it profits off of war.


It is evident the government will not encourage news channels or social media platforms to promote topics that go against its ideology and agenda. These platforms benefit more when associated with the controversies of politics, especially when the audience of potential voters can be swayed toward a political party. As a marketing student, I know that the more viewers these media platforms get, the higher the ratings and the more profit is generated.


Additionally, even if we do not factor in the intervention of government and mainstream media, these anti-war websites are also very difficult to navigate. The Antiwar.com site is especially hard to use, as all of the headlines are mashed together in the same font and color. The color scheme incorporates red, white, black, and blue, making it challenging to distinguish crucial headlines. This lack of differentiation could deter readers, as it diminishes the readability and visual appeal of the content.

I believe that all opinions should be accessible and prominently displayed to create an educated and informed society. This openness enables us to form our own well-informed perspectives on challenging topics. After conducting extensive research on Google, I've discovered numerous well-organized and articulate anti-war and pro-peace websites and organizations. Peace Action, for example, is an organization I just learned about; it opposes wars and advocates for reduced military spending, using public education to influence policy.




Thursday, November 9, 2023

Blog Post #5

EOTO Technology 


The technology that I learned about is the history and impact of Bluetooth, presented by Luis Rolando Robles Delgado. Bluetooth technology has come a long way, and its name traces back to 910/911 AD in Denmark, around the birth of King Harald Gormsson. He was known for uniting Denmark and Norway during his reign, which began around 958 AD. More importantly, he was notable for his dead tooth, which had a greyish-blue color, leading to the nickname "Bluetooth." The Bluetooth logo pays homage to the runes for his first initial (H) and the first letter of his nickname (B). It is so interesting that a 20th-century invention was inspired by a king from the mid-900s AD.

In 1996, 'Bluetooth' was the temporary code name proposed by Jim Kardach, a representative of the manufacturer Intel, during a meeting between Intel, Ericsson, and Nokia. He worked alongside Jaap C. Haartsen and his team at Ericsson to find a way to give mobile devices short-range radio connections. The journey from code name to household term is a testament to its global effect on society.

Today, Bluetooth is overseen by the Bluetooth Special Interest Group, which has more than 35,000 member companies in the areas of telecommunication, computing, networking, and consumer electronics. They are currently working on advancements in smart living, including toys, tools, appliances, lights, and more.

Luis Delgado provided great insights into the positive and negative impacts of Bluetooth on society. A wide range of devices, including wireless headphones, Apple Watches, televisions, and cellphones, support Bluetooth in one way or another. Interestingly, Luis noted Bluetooth is incorporated in almost every industry, from healthcare to marketing. It also eases the transfer of data, even large files, without an internet connection. However, the potential risks associated with Bluetooth are worth mentioning, especially the claimed health risks: concerns have been raised about pregnancy loss, ADHD in children, and radiation exposure, all of which deserve attention.


Bluetooth was developed with the goal of enabling wireless, short-range, and low-power communication. It aimed to eliminate the inconvenience of tangled wires, provide secure connectivity, and preserve the battery life of connected devices. As a result, Bluetooth has become a highly versatile and widely adopted technology, finding applications across a wide spectrum of personal and professional domains.
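As a small, hands-on illustration of that short-range design, here is a minimal sketch that scans for nearby Bluetooth Low Energy devices. It assumes the third-party Python library bleak and a computer with a BLE-capable adapter; the five-second timeout is just an example value, not anything from the presentation.

```python
# Minimal sketch: list nearby Bluetooth Low Energy devices.
# Assumes the third-party "bleak" library (pip install bleak) and a BLE-capable adapter.
import asyncio

from bleak import BleakScanner


async def scan_for_devices(timeout: float = 5.0) -> None:
    # Listen for BLE advertisements for `timeout` seconds.
    devices = await BleakScanner.discover(timeout=timeout)
    for device in devices:
        # Many devices advertise an address but no human-readable name.
        print(f"{device.address}  {device.name or '(unnamed device)'}")


if __name__ == "__main__":
    asyncio.run(scan_for_devices())
```

Even this tiny example reflects the original design goals: no wires, no internet connection, and only a few seconds of low-power radio listening.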



Monday, November 6, 2023

Blog Post #4

 PERSONAL COMPUTER 

Technology has transformed the world since 3500 BC, but the personal computer has had the greatest impact. The personal computer, or 'PC,' is a digital computer designed for use by one individual. It is commonly used for personal and professional purposes. The invention of the personal computer resulted from technological advancements and societal needs, combined with the creative efforts of engineers and entrepreneurs. There isn't a single inventor or specific circumstance that led to the creation of the PC; rather, it was a gradual process driven by several key factors. The development of microprocessors and the miniaturization of electronic components allowed for the creation of smaller, more affordable devices. This innovation evolved into a revolution that has forever changed the way we live, work, and connect within society.


In 1971, the public was introduced to the personal computer by John Blankenbaker of Kenbak Corporation. This PC, intended for educational use, was called the Kenbak-1. It is widely considered the first-ever PC and significantly influenced the personal computer industry. Blankenbaker himself designed, built, and marketed the Kenbak-1 as a computer for learning machine-code programming, but it was ultimately a commercial failure.



After the failure of the Kenbak-1, the MITS Altair 8800, designed and marketed by hobbyist Ed Roberts in 1975, became one of the first commercially successful microcomputer kits. Establishing clear goals made Ed Roberts more successful than Blankenbaker in building up the PC industry. Later, in 1977, the industry boomed with ready-to-use computers such as the Apple II, the TRS-80, and the Commodore PET, among other newcomers. The Apple II helped revolutionize the industry as one of the first personal computers to offer color graphics.

In the early 1980s, Radio Shack, Commodore, and Apple collectively dominated the microcomputer market, accounting for roughly half a million of the machines in existence at that time. The publication of "Domesticated Computers" in Byte magazine in January 1980 marked a turning point, as component prices began to decrease and an influx of new companies entered the field. This triggered a wave of affordable machines called home computers, which achieved remarkable sales figures, reaching millions of units sold. The home computer was used for communicating with colleagues over a business or office network, for gaming, and for administrative tasks such as word processing and spreadsheets.

The 1990s was a significant time for PCs and technology in general. It was a dynamic period for personal computers, marked by rapid advancements, the spread of the internet, and the growth of software and gaming on personal devices. It also laid a foundation for computing trends that continue to influence technology today.

Between the 2000s and 2010s, personal computing saw major shifts. The period witnessed the widespread adoption of the iconic Microsoft Windows XP, which offered improved stability and user-friendliness. Ultrabooks, MacBooks, and gaming PCs gained prominence, while Chromebooks and convertible laptops (2-in-1 devices) became popular. Virtual reality and advancements in hardware were key trends of this era. At the same time, desktop computers experienced a decline in sales and usage, while laptops and mobile devices flourished.



The invention of the personal computer has had a profound and wide-ranging impact on our world, transforming various aspects of society, the economy, and communication. Personal computers were conceived to address diverse challenges such as data processing and word processing, while also revolutionizing education, boosting productivity, and providing entertainment.

Today, PCs come in various forms, including desktops, workstations, laptops, notebooks, and tablets. In 2022, 260 million notebook units were projected to be shipped worldwide, highlighting a substantial surge in global demand and their increasingly multifunctional nature. Personal computers have also revolutionized communication by increasing the speed, reach, and accessibility of global interactions. Tools like email, social media, and video conferencing have become core components of communication, in both personal and professional realms.




However, the personal computer has some negative effects on society. The use of personal computers has raised privacy and security concerns, as there is a higher risk of cybercrime. Secondly, there is an increase in health issues among computer users, most commonly musculoskeletal problems; headache and back pain are the most frequent symptoms associated with prolonged use of computers and the internet. Additionally, the use of personal computers can increase depression and antisocial behavior, as users may withdraw from real-life interaction and constantly compare themselves to what they see through this easy access. Finally, overuse can hinder creativity and increase the user's dependency on the PC. There are a multitude of beneficial tools, like GPS devices, grammar checkers, and calculators; nevertheless, if you become too reliant on them, you may be left in a vulnerable position when they fail to work.

Overall, the personal computer, or PC, has been a transformative force since the 1970s. It has revolutionized personal and professional life, driven by technological advancements and societal needs. This evolution, marked by the miniaturization of electronic components, has forever changed the way we live, work, and connect with one another. The PC is a testament to the enduring impact of innovation and technology on our world.









