The right time to scream “Fire!”

Your Computer Is on Fire
Edited by Thomas S. Mullaney, Benjamin Peters, Mar Hicks, and Kavita Philip (MIT Press, 2021)


A general policy guides many executives today: create businesses that can harness the power of new technologies, scale as quickly as possible and show investors that these businesses can compete in an uncertain future. The result is the rapid expansion of digital (and digitally driven) companies that have become more relevant and essential to our lives. But these companies have also created a wide range of unintended harmful consequences.

The impetus for business to deal with these consequences, which include personal data breaches, algorithmic bias, the spread of misinformation, widening inequality, and environmental damage, is growing. Activist shareholders, among many other stakeholders, are pushing for “responsible technology” policies and a stronger link between technology ethics and executive compensation. Socially and environmentally conscious consumers are voting with their wallets, prompting businesses to reevaluate their products and purposes, including their role as employers of a diverse workforce. The global pandemic has only accelerated the pace of change.

But how can companies maximize the positive effects of technology while minimizing the bad ones? This is the next great challenge for corporate leaders and for our system as a whole. And for this reason, Your Computer Is on Fire is the best technology book of 2021: it presses business leaders to confront this question.

In some ways, this is an unusual choice. The book comprises 16 essays rather than a single narrative, and they are written by academics for an audience of STEM students, humanists, technologists, and sociologists. But executives who make their way through the 400-plus pages of Your Computer Is on Fire will come away more than a little unsettled. The authors fearlessly puncture the technology industry’s most sacred assumptions, forcing us to rethink everything we have come to accept as true about our digital lives and the multibillion-dollar digital transformations under way within our companies. Chapter titles such as “Gender Is a Corporate Tool,” “A Network Is Not a Network,” and “Coding Is Not Empowerment” leave no doubt about the authors’ intent.

In the collection’s first and most compelling essay, “The Cloud Is a Factory,” Nathan Ensmenger, an associate professor at Indiana University, helps readers think differently about one of the most transformative technologies for a generation of businesses: the cloud.

What exactly is the cloud? The quick answer is that it is a set of computing and software services, from email to inventory-tracking applications, that users access via the internet rather than through a desktop machine or an in-house server. Cloud computing platforms have proven to be a powerful tool for testing new methods and experimenting with new technologies, including advanced analytics and 3D printing.

But in more basic terms, the cloud is a set of computers sitting somewhere in a data center, computers that require metals and plastics as well as physical inputs such as electricity, water, and human labor. In other words, an industrial factory. As Ensmenger observes, a typical data center consumes between 350 and 500 megawatts of power and requires about 400,000 gallons of fresh water per day for cooling.

Yet because the term cloud is used as a metaphorical device, and because the cloud is assumed to be a sophisticated, virtual technological solution, the computing industry has been able to obscure its long history of dependence on physical infrastructure and resources. In the past, when a traditional factory contaminated the water supply or crippled its workers, public policy responded, albeit belatedly. But the cloud remains largely unregulated, and its negative, factory-like effects go underreported. “Let’s bring this deliberately vague and ethereal metaphor back to Earth within a broader history of technology, labor, and the built environment, before it’s too late,” Ensmenger pleads.

In another essay, “Your Robot Is Not Neutral,” Safiya Umoja Noble, an associate professor at the University of California, Los Angeles, calls for a deeper, more widespread understanding of the processes involved in the creation of data, which, she reminds us, are at their root social constructions. Just as race and gender are social constructs, decisions we make rather than things that are immutable or exist naturally, so too are the data that have come to dominate our lives. The problem is that data are too often disconnected from the historical and social practices that inform their construction. When data are generated by a set of discriminatory social processes, such as the statistics produced by policing a city, it can be nearly impossible to see that the data also reflect practices such as the over-policing of African American, Latino, and low-income neighborhoods and unequal arrest rates, Noble argues. “The notions of the accuracy and neutrality of data are so deeply embedded in training and teaching about what data is that ‘mathematics cannot discriminate because it is mathematics,’” she writes.

The essays also address gender inequality, another persistent problem in the high-tech world. In “Sexism Is a Feature, Not a Bug,” Mar Hicks, an associate professor at the Illinois Institute of Technology, recounts the history of sexist hiring and firing practices in the UK computing sector and explains how electronic computing technology became an “abstraction of political power” in machine form. This sexism was not just an accident, Hicks writes, but a feature of how the systems were designed to work and how they would continue to work without significant outside intervention.

Although each author examines a different problem through his or her own unique lens, the essay collection Your Computer Is on Fire achieves a narrative coherence. History, especially the history of computing and of industrial society, serves as a purposeful and cunning organizing device, because the technology industry is not built for looking back, only forward. The industry has founded itself on the idea of constant reinvention, and the authors know that mining history for lessons is not in its DNA. But it should be. As the authors of Your Computer Is on Fire make clear, much is at risk when we ignore history and fail to think more humanely about computing.

Of course, companies will have to address the harms caused by technology so that those harms do not outweigh the benefits. The book offers no specific recommendations for creating responsible technology policies, nor does it outline broad policy changes. But Your Computer Is on Fire succeeds by forcing us to adjust how we think and talk about the technologies at the center of business and society. And that, the authors note, is a good starting point for change.

Benjamin Peters, a University of Tulsa professor and the author of “A Network Is Not a Network,” puts it this way: technology will deliver neither its promise nor its curse, and observers of technology should avoid both utopian dreaming and dystopian catastrophizing. The world is on fire, he writes, but it will not be either cleansed or destroyed at the exact day and time that the self-proclaimed prophets of profit and destruction foretell.

Honorable mentions:

Futureproof: 9 Rules for Humans in the Age of Automation
By Kevin Roose (Random House, 2021)

Artificial intelligence and advanced robotics are making it possible for machines to perform tasks that once required a person. By some estimates, about half of the jobs in the US economy could eventually be made obsolete. But what if our future reality is less stark than that? What if automation displaces millions from their jobs and, at the same time, improves healthcare diagnostics and helps slow climate change? And how can we thrive in such a hybrid environment? These are the questions at the heart of Futureproof, a compelling book by New York Times columnist Kevin Roose. With candor and humor, Roose tries to correct some of the flaws in the way we think about AI and suggests ways we can make the most of its benefits. Whether he is advocating “consequentialist thinking” or urging that “digital prudence” become part of a standard STEM curriculum, Roose makes a significant contribution to the scholarship on our AI future in this highly readable and engaging book.

A World Without Email: Reimagining Work in an Age of Communication Overload
By Cal Newport (Portfolio / Penguin, 2021)

Did you get my email? Email, and its ever-growing volume, has become a scourge of the 21st-century worker. But Cal Newport, an associate professor of computer science at Georgetown University, believes we can live without it. In his illuminating book, A World Without Email, Newport examines how workplaces have fallen into a “hyperactive hive mind” workflow of constant, rapid-fire communication, response, and information sharing, and the problems that result. This way of working, Newport argues, compels people to check their inboxes and messaging platforms constantly, which erodes their ability to concentrate and leads to mental fatigue and dissatisfaction with work. His highly accessible book outlines four broad principles for redesigning work without email: the attention capital principle (treat attention as a valuable asset), the process principle (develop work processes that maximize the value generated from that attention), the protocol principle (structure work processes to optimize coordination among employees), and the specialization principle (allow employees to work more deeply on fewer things). Changing the email culture of our workplaces will not be easy, but Newport notes that it is one of the most exciting and consequential challenges we face today.
