Techlash: Our Dependence on High-Tech Oligarchies and the Long-Term Effects on Society

March 13, 2020

 

Introduction

Google, Amazon, Facebook, Apple, and Netflix (GAFAN) are among the most prominent and wealthiest companies today. GAFAN has deep insight into our daily lives because each company collects data during the routine use of its respective services and products. That vast store of data gives GAFAN incredible power, and some argue these companies have become too powerful and might misuse the data they hold, a concern captured in the popularized term technology backlash, or 'techlash.' This paper discusses government involvement in regulating GAFAN through antitrust action, individual and corporate responsibility, consumer behavior, and some of the most significant issues technology poses, both to itself and to humanity, now and in the future. Lastly, technology permeates every part of modern life, from human interaction to business transactions to the Christian faith. The Christian must understand the potential issues, such as the potential for addiction and technology's inherent dual use for both good and evil, if we are to experience our ultimate good.

 

Government Involvement

Anti-trust

One of the ways governments rein in companies is through antitrust lawsuits and regulations. The word antitrust comes from a practice of the Gilded Age: by creating trusts, that is, by handing company stock to a designated trustee, businesses could use that mechanism to exclude competitors and potentially create a monopoly. Margrethe Vestager, the European Commissioner for Competition, argues that market intervention is sometimes necessary. She explains that, for the most part, businesses can regulate themselves, but in some cases the competitive race never ends on its own and regulators must step in (Tedtalks: Margrethe Vestager, 2017).

 

Microsoft and Google

In the late 1990s and early 2000s, Microsoft was the target of an antitrust lawsuit brought by government regulators. The suit alleged that Microsoft not only held a monopoly by virtue of its market share but also engaged in monopolistic practices; for example, it enjoyed unfair market dominance through Internet Explorer and through pricing schemes that charged Original Equipment Manufacturers (OEMs) a fraction of fair market value. As a result of the lawsuit, the government forced Microsoft to adjust some of its practices, creating a freer market for all. Those results trickle down even today; for example, a user can download and install, without limitation, any web browser they wish to use.

Conversely, Google has found itself in the crosshairs of government regulators in the European Union (EU). Google was fined three times between 2017 and 2019, for a total of more than 8 billion euros (European Commission, 2019), or roughly 9 billion USD at the exchange rates of the time. The 2017 fine was for giving its own comparison-shopping service an illegal advantage; the 2018 fine was for using the Android operating system to entrench the dominance of its search engine; and the 2019 fine was for restrictive contract clauses that prevented website operators from displaying competitors' advertisements. In 2019, 50 state attorneys general launched an antitrust probe into Google in the United States. While that investigation may last for several years, regulation of Google's search and advertising businesses may yet require changes to its practices.

Individual Responsibility

When we talk about technological oligarchies like Google or Facebook, we mean sizeable, semi-autonomous companies whose products, such as social media and email, permeate our daily lives. These products are so easy to use and so ubiquitous that navigating everyday life without them is a challenge. For all the good these products and services do, there is a growing backlash, and it concerns not only the size and dominance of the companies but also their products, specifically how those products obtain and retain our attention, which fuels growing concerns about psychological dependence.

The Attention Economy

For example, the instant messaging platform Snapchat has a feature called 'Snapstreak,' which awards a badge next to a conversation. The badge increments daily, provided the two participants each send at least one message every twenty-four hours. The feature itself might be considered harmless, but others argue that this kind of gamification pushes the user to invest time and attention repeatedly. Mark Griffiths, a professor at Nottingham Trent University, explains that the more resources a person spends on a given activity, the more compelled they are to continue (2018). In the case of Snapstreaks, that means daily attention and investment; to stop suddenly would render the investment a waste, which only compels the user to keep going. Senator Hawley has even introduced a bill that would ban the Snapstreak feature along with auto-play and endless scrolling.
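To make the mechanic concrete, below is a minimal sketch of how a streak counter of this kind might work. The class name, the strict 24-hour window, and the reset rule are assumptions for illustration; Snapchat's actual implementation is not public.

```python
from datetime import datetime, timedelta

# Assumed window; Snapchat's real rules are not public.
STREAK_WINDOW = timedelta(hours=24)

class Streak:
    """Hypothetical streak counter: both participants must message within each window."""

    def __init__(self):
        self.count = 0            # the badge shown next to the conversation
        self.window_start = None  # when the current 24-hour window began
        self.senders = set()      # who has messaged in the current window

    def record_message(self, user_id, now=None):
        now = now or datetime.utcnow()
        if self.window_start and now - self.window_start > STREAK_WINDOW:
            # The window was missed: the accumulated investment is lost.
            self.count = 0
            self.senders.clear()
            self.window_start = None
        if self.window_start is None:
            self.window_start = now
        self.senders.add(user_id)
        if len(self.senders) == 2:
            # Both sides showed up in time: the badge ticks up and a new window starts.
            self.count += 1
            self.window_start = now
            self.senders.clear()

# Two users keep a streak alive for three consecutive days.
streak = Streak()
start = datetime(2020, 3, 1)
for day in range(3):
    streak.record_message("alice", start + timedelta(days=day, hours=1))
    streak.record_message("bob", start + timedelta(days=day, hours=2))
print(streak.count)  # 3
```

The key design point is the reset: one missed day erases the entire accumulated count, which is precisely the sunk-cost pressure Griffiths describes.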

Growing Backlash

The ACM Code of Ethics states that a central aim of computing should be to lessen the adverse effects of technology (Anderson, 1992). But Google and Facebook are huge companies with many facets and a global reach, and it would be unfair to blame the problems discussed here, or others, on any one person or engineer. Yet some engineers who were employed by companies like Google have recently raised concerns about just how powerful and influential these programs and services are. For example, Tristan Harris, who worked at Google and now leads the Center for Humane Technology, argues that technology companies are in a race to the bottom to capture and keep our attention (2017). Each engineer does have an ethical responsibility when creating these products and services, but not absolute responsibility, in the sense that they cannot control how Google, Facebook, or Snapchat uses the product they helped create. But as we see with Tristan Harris, some technology insiders are exercising their moral compass to help influence those companies' direction.

Corporate Responsibility

There is much debate about morality and ethics in business, especially for a company like Google, which is vast, touches almost every area of consumer technology, and has some involvement with government as well. One sub-topic in this debate is whether large companies like Google are morally responsible for the ubiquity of their wares and for how those products and services are put to use in society. The economist Milton Friedman, speaking on corporate social responsibility (CSR), said that a company's obligations are to its shareholders; it is they who decide what to invest in and where, so long as the business activities are legal (Friedman, 2009).

Liability and Morality

For example, those investments can be funneled back into the company to fund ethics programs that guide future product development, or not at all. Amy Sepinwall, writing in The Moral Responsibility of Firms on the emotional aspect of blame, argues that blame rests on guilt, which firms are incapable of feeling (2017). Further, John Hasnas, in the same volume, explains that a firm's moral responsibility is not a prerequisite for holding it civilly or criminally liable when it steps outside the law (2017), which partially supports Friedman's statement as well. So even if one cannot hold a firm morally responsible, firms are still bound by current law, which remains one tool for modifying their behavior.

Popularity and Ubiquity of GAFAN

Technology companies enjoy unprecedented popularity in American society. As of 2018, 74% of Americans said that technology companies have a net positive effect on society, yet many of the same respondents said they do not trust those companies to do what is right (Smith, 2018). Moreover, certain technologies and platforms have an intrinsic power of amplification: a single system or application can reach millions, if not billions, of people at once, which should imply some obligation to society to ensure proper use, even if that obligation cannot be fully regulated. Further, building on the Pew poll, technology companies have the potential to do the wider community a great deal of good. For example, both Facebook and Google offer applications and services that help anyone in a crisis find loved ones.

In the final analysis, technology companies do have a civic responsibility toward the stewardship of their products and services. However, technology companies are not people, and one cannot ascribe human emotions such as guilt or blame to them. And while a firm's primary obligation might be to its shareholders, it also has a mandate to operate within the law. With 'techlash' apparently on the rise, firms might work a little harder at repairing the trust the public has lost in their willingness to do the right thing.

Consumer Behavior

Consumers play an essential role in the growth of technology companies, both through their willingness to have data collected about them and through society's collective drive for material accumulation and price sensitivity. In terms of data, there is an unwritten bargain in today's technologically advanced society: a user gives up data about themselves in exchange for a product or service, whether free or paid. One study in the UK showed that roughly two-thirds of citizens would be willing to exchange their data so long as there was a value proposition involved (Ridley-Siegert, 2015), and another study by Experian showed that 90% of Americans are aware of data collection in general (Experian, 2019).

Data Collection and User Agreements

But to call the rule unwritten is a bit of a misnomer. The practice is written down and presented in the form of user agreements and privacy policies. It becomes 'unwritten' because users are aware of the collection in general but perhaps not of its specific terms or depth. Still, consumers want a personalized, specialized experience, whether in an application, a shopping trip, or their newsfeeds, and they are willing to provide the data to make it happen. The data creates a feedback loop: the more data consumers offer, the better companies become at serving them, thus growing their market share, dominance, and ubiquity.

 

In modern society it is likely impossible to escape technology and data collection entirely; a recent Gallup poll shows that most Americans view technology firms positively (Saad, 2019) and likely do not want to escape technology anyway. However, it is also true that no existential force requires consumers to embrace technology as wholeheartedly as we do today. A big driver, and the foundation of many tech companies, is the 'free' model, in which the product costs nothing but the consumer consents to data collection. The idea of free is a powerful psychological motivator. One study examined consumer behavior when choosing between two products: one inferior but free, the other superior and significantly discounted. Irrationally, many people still chose the free option even though the price difference was small and the superior product arguably provided more utility (Shampanier, Mazar, and Ariely, 2007).

The same is true when a user downloads a free smartphone application. For example, an app may provide a service such as step counting while also tracking location data on the device. The location data, in aggregate, is likely worth more than whatever convenience the user received from counting steps, and most users would agree that their location information is intrinsically worth more, for privacy reasons, than a tally of their steps. However, the power of free sometimes overrides that logic, to the benefit of the big technology companies that collect and organize this information and use it to guide decisions about new services and products.
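As an illustration of the pattern rather than any real vendor's code, the sketch below shows a hypothetical 'free' step counter that quietly stores a timestamped location fix with every step event; all class, method, and field names are invented.

```python
import json
from datetime import datetime

class HypotheticalStepCounter:
    """Illustrative only: a 'free' pedometer that also logs location for its vendor.
    No real application or API is represented here."""

    def __init__(self):
        self.steps = 0
        self.telemetry = []  # data collected alongside the free feature

    def on_step_detected(self, latitude, longitude):
        # The visible benefit to the user: one more step counted.
        self.steps += 1
        # The less visible cost: a timestamped location fix stored for later upload.
        self.telemetry.append({
            "ts": datetime.utcnow().isoformat(),
            "lat": latitude,
            "lon": longitude,
        })

    def upload_payload(self):
        # In aggregate, the location trail is likely worth more than the step total.
        return json.dumps({"steps": self.steps, "locations": self.telemetry})

app = HypotheticalStepCounter()
app.on_step_detected(36.85, -76.29)  # example coordinates
app.on_step_detected(36.85, -76.30)
print(app.upload_payload())
```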

Technology Companies' Serious Future Issues

 

Addictive Potential from Heavy Usage

With the ubiquity of technology comes the potential for its overuse and, in some cases, dependence. While the medical research community is still investigating the dependence question, there has been some movement on the topic. In 2018 the World Health Organization created a new diagnostic category for gaming disorder and included it in the eleventh revision of the International Classification of Diseases (ICD-11) (Humphreys, 2019). While no such diagnosis appears in the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), it has been flagged for future study (Kamenetz, 2019).

 

Both online and offline gaming are salient examples, as most of the major technology companies, including Google, are working on or have released products that further blur the lines of use. Google Stadia, for example, is an online video game streaming service that requires only minimal hardware and an internet connection (Pichai, 2019). Services like Stadia are not addictive by themselves; however, immediate access combined with certain game design elements (loot boxes and other in-game reward systems) can foster heavy usage patterns. Other technologies and services, such as Facebook and smartphones, carry the same potential for overuse. One study of about 3,000 people found that about 1% met criteria for Internet addiction and about 4% for cell phone overuse (Sharma et al., 2017).
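A minimal sketch of a variable-ratio reward draw illustrates why mechanics like loot boxes invite 'one more' open; the drop table and probabilities below are invented for illustration and do not describe Stadia or any particular game.

```python
import random

# Hypothetical drop table: the rare reward is kept improbable, so most opens
# come up short and the next open always feels like it might be the one.
DROP_TABLE = [
    ("common item", 0.80),
    ("rare item", 0.18),
    ("legendary item", 0.02),
]

def open_loot_box(rng=random):
    roll = rng.random()
    cumulative = 0.0
    for item, probability in DROP_TABLE:
        cumulative += probability
        if roll < cumulative:
            return item
    return DROP_TABLE[-1][0]

# On average, how many boxes must be opened to see one legendary item?
random.seed(0)
trials = 10_000
total_opens = 0
for _ in range(trials):
    while True:
        total_opens += 1
        if open_loot_box() == "legendary item":
            break
print(total_opens / trials)  # roughly 1 / 0.02, i.e. about 50 boxes
```

Variable-ratio schedules of this kind are the same reinforcement pattern that behavioral researchers associate with persistent, hard-to-interrupt engagement.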

Artificial Intelligence (AI)

 

Another potential problem with AI, as it relates to the industrial complex of today's largest technology companies, is the prospect of a technological singularity. The singularity is defined as the creation of hardware and software that surpasses the computational ability of the human brain (Burkhardt, 2011). The threat is that once the singularity arrives, computer systems will refine and redesign themselves far beyond anything the human mind could ever achieve. Such an event, self-improving computers able to redefine themselves, could be an existential threat to the human race, which is why such a system is sometimes called the 'last invention.'

 

The largest technology companies, along with a variety of smaller firms, are working on AI. For example, Google created AlphaGo, an AI system that defeated the best human players at Go, an ancient Chinese game with more board configurations than there are "atoms in the universe" (Silver and Hassabis, 2016). AlphaGo was not only trained to beat its opponents; it also partially taught itself, ingesting games to develop its own strategies (Chen, 2016). Whether or not a singularity event ever happens, it is essential to weigh ethical considerations in AI today, primarily to ensure AI operates without bias (Yapo and Weiss, 2018).
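The scale behind the "atoms in the universe" comparison is easy to check: each of the 361 points on a 19 × 19 Go board can be empty, black, or white, so a crude upper bound is 3^361, on the order of 10^172 configurations (the number of legal positions is smaller, around 10^170, but the comparison with the commonly cited ~10^80 atoms holds either way). A quick back-of-the-envelope check:

```python
import math

BOARD_POINTS = 19 * 19                  # 361 intersections on a standard Go board
raw_configurations = 3 ** BOARD_POINTS  # each point is empty, black, or white
ATOMS_IN_UNIVERSE = 10 ** 80            # common order-of-magnitude estimate

print(f"3^361 is about 10^{math.log10(raw_configurations):.0f}")  # about 10^172
print(raw_configurations > ATOMS_IN_UNIVERSE)                     # True
print(f"the ratio is about 10^{math.log10(raw_configurations) - 80:.0f}")
```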

 

Christian Perspective

 

Trust In Technology

Since Sir Tim Berners-Lee's invention of the World Wide Web and DARPA's design of the underpinnings of today's Internet, technology has brought the world into a new era. Today we have incredible information at our fingertips with the help of GAFAN. Applications on pocketable computers can tell us where to go, speakers in the kitchen can tell us the weather, and cars will soon talk to each other and drive themselves to avoid collisions and, hopefully, reduce the loss of life. Further, with the invention and proliferation of AI, our world will likely evolve again. But even before the modern technology discussed here, the technology of an earlier age, the domestication and use of horses and chariots, changed the nature of travel and warfare (Kelekna, 2009). Yet, as Psalm 20:7 reminds us, then as now the Christian should trust not in technology but in God. As Bertrand Russell argues, we admire machines and technology for their beauty and power but also hate them because they impose slavery (2004).

 

Technology's Dual Use

Technology is, and always will be, dual use: it can serve both good and evil. The critical differentiator is the people who use it and who are involved in creating it. 'Summum bonum' is the Latin expression for the 'highest good.' In Christian ethics, the highest good is God, and He is the 'main reference point' (Ciszek, 2014). The Christian must question and investigate how technologies are used in our lives. For example, does the use of social media distract or detract from the Christian imperative of always seeking God's summum bonum? Does use of the Internet invite seemingly irresistible temptations to lust? For the creators of such technology, it is critical to think not only about the intended implications of a creation but also about the unintended ones (Whiting, 2011), such as social media or gaming mechanics (loot boxes and the like) that might lead to overuse and possibly addiction.

Conclusion

           

It is challenging to find an area of contemporary life that technology does not touch. It has given us the ability to better ourselves with more information and to live better through, for example, a broader selection of material goods. But left unchecked by societal, governmental, or market corrections, technology companies could lose sight of creating ethical products and services that serve a greater societal good. For the Christian, technology has also enriched and enhanced the faith, for example through the ability to fellowship with Christians anywhere. Yet technology also has the potential to add 'static' to our communion with God, both as a distraction and as a way of losing sight of His precepts and presence in our lives.

 

 

References

Anderson, R. E. (1992). ACM code of ethics and professional conduct. Communications of the ACM, 35(5), 94–99. doi: 10.1145/129875.129885

Burbach, R. (2001). Globalization and postmodern politics: From Zapatistas to high-tech robber barons. London: Pluto Press. Retrieved from http://search.ebscohost.com.ezproxy.regent.edu:2048/login.aspx?direct=true&db=nlebk&AN=72444&site=ehost-live

Burkhardt, C. (2011). The trajectory to the "technological singularity". The Social Impact of Social Computing, 94.

Chaining giants; the global techlash. (2017, August 12). The Economist, 424, 46. Retrieved from http://eres.regent.edu:2048/login?url=https://search-proquest-com.ezproxy.regent.edu/docview/1927928669?accountid=13479

Chen, J. X. (2016). The evolution of computing: AlphaGo. Computing in Science & Engineering, 18(4), 4-7.

Ciszek, M. (2014). Environmental ethics from a Thomistic-personalistic perspective (implications for the sustainable development concept). Problems of Sustainable Development, 9(1), 97-106.

European Commission. (2019, March 20). [Press release]. Retrieved from https://ec.europa.eu/commission/presscorner/detail/en/IP_19_1770

Experian. (2019, January 29). 70% of consumers would share more data if there was a perceived benefit, with greater online security and convenience at the top of the list. Retrieved February 22, 2020, from https://www.prnewswire.com/news-releases/70-of-consumers-would-share-more-data-if-there-was-a-perceived-benefit-with-greater-online-security-and-convenience-at-the-top-of-the-list-300785756.html

Friedman, M. (2009). Capitalism and freedom. University of Chicago Press.

Griffiths, M. D. (2018). Adolescent social networking: How do social media operators facilitate habitual use? Education and Health, 36(3), 66-69.

Harris, T. (2017, April). How a handful of tech companies control billions of minds every day [Video]. Retrieved February 9, 2020, from https://www.ted.com/talks/tristan_harris_how_a_handful_of_tech_companies_control_billions_of_minds_every_day/transcript?language=en

Hasnas, J. (2017, March 30). The phantom menace of the responsibility deficit. In The Moral Responsibility of Firms. Oxford University Press. Retrieved February 14, 2020, from https://www-oxfordscholarship-com.ezproxy.regent.edu/view/10.1093/oso/9780198738534.001.0001/oso-9780198738534-chapter-6

Humphreys, G. (2019). Sharpening the focus on gaming disorder. Bulletin of the World Health Organization, 97(6), 382-383.

Kamenetz, A. (2019, May 28). Is 'gaming disorder' an illness? WHO says yes, adding it to its list of diseases. Retrieved February 28, 2020, from https://www.npr.org/2019/05/28/727585904/is-gaming-disorder-an-illness-the-who-says-yes-adding-it-to-its-list-of-diseases

Kelekna, P. (2009). The horse in human history (pp. 175-184). Cambridge: Cambridge University Press.

Levmore, S., & Nussbaum, M. C. (2010). The offensive Internet: Speech, privacy, and reputation. Cambridge, MA: Harvard University Press. Retrieved from http://search.ebscohost.com.ezproxy.regent.edu:2048/login.aspx?direct=true&db=nlebk&AN=364813&site=ehost-live

Regan, P. M. (1995). Legislating privacy: Technology, social values, and public policy. Chapel Hill: The University of North Carolina Press. Retrieved from http://search.ebscohost.com.ezproxy.regent.edu:2048/login.aspx?direct=true&db=nlebk&AN=1592&site=ehost-live

Ridley-Siegert, T. (2015). Data privacy: What the consumer really thinks. Journal of Direct, Data and Digital Marketing Practice, 17, 30-35. doi: 10.1057/dddmp.2015.40

Russell, B. (2004). Sceptical essays. Psychology Press.

Saad, L. (2019, October 11). Americans split on more regulation of Big Tech. Retrieved February 22, 2020, from https://news.gallup.com/poll/265799/americans-split-regulation-big-tech.aspx

Schmitt, M. N. (2017). Tallinn manual 2.0 on the international law applicable to cyber operations. Cambridge, United Kingdom: Cambridge University Press.

Sepinwall, A. (2017, March 30). Blame, emotion, and the corporation. In The Moral Responsibility of Firms. Oxford University Press. Retrieved February 14, 2020, from https://www-oxfordscholarship-com.ezproxy.regent.edu/view/10.1093/oso/9780198738534.001.0001/oso-9780198738534-chapter-9

Shampanier, K., Mazar, N., & Ariely, D. (2007). Zero as a special price: The true value of free products. Marketing Science, 26(6), 742-757.

Sharma, M. K., Rao, G. N., Benegal, V., Thennarasu, K., & Thomas, D. (2017). Technology addiction survey: An emerging concern for raising awareness and promotion of healthy use of technology. Indian Journal of Psychological Medicine, 39(4), 495.

Silver, D., & Hassabis, D. (2016, January 27). AlphaGo: Mastering the ancient game of Go with machine learning. Retrieved February 29, 2020, from https://ai.googleblog.com/2016/01/alphago-mastering-ancient-game-of-go.html

Smith, A. (2018, June 28). Public attitudes toward technology companies. Retrieved from https://www.pewresearch.org/internet/2018/06/28/public-attitudes-toward-technology-companies/

Tedtalks: Margrethe Vestager—The new age of corporate monopolies [Video file]. (2017). Retrieved February 1, 2020, from https://fod.infobase.com/PortalPlaylists.aspx?wID=104759&xtid=160779

Whiting, S. (2011). Towards a biblical view of technology. Mount Vernon Nazarene University, Mount Vernon, Bethel University, Ohio.

Wu, Y., Lau, T., Atkin, D. J., & Lin, C. A. (2011). A comparative study of online privacy regulations in the U.S. and China. Telecommunications Policy, 35(7), 603-616. doi: 10.1016/j.telpol.2011.05.002

Yapo, A., & Weiss, J. (2018). Ethical implications of bias in machine learning. Hawaii International Conference on System Sciences, Waikoloa Village, Hawaii.
