SXC Journal of Management and Social Sciences (SJMSS)
Published by St. Xavier's College, Maitighar, Kathmandu, Nepal
ISSN: 2822-1850
Volume 3, Issue 1, 2023
Page: 9-24
This work is licensed under a Creative Commons Attribution 4.0 International License.
Mani Man Singh Rajbhandari1*, Rajan Karmacharya2 and Shrijana Karki2
Department of Management1*, Department of Computer Sciences (CSIT) 2
St. Xavier’s College, Maitighar, Nepal
Email: manirajbhandari@sxc.edu.np*
Manipulation of people's mind through digitalization: Ethic vs Profit
Abstract: Digitalization has become a sophisticated medium of daily activities in recent days. Digitalization has created an avenue of debate between ethics and profit. Although digitalization is making life easier, it is simultaneously manipulating people's minds towards a profit maximization objective rather than being socially responsible and observing ethical practices. This study explores whether technology is manipulating human behavior and whether ethical practices are being applied. An interview technique was applied to collect data. Altogether 10 respondents were interviewed; among these, 5 were developers and 5 were users. Coding was applied and the data triangulation method was used for analysis. The results suggest that there are multiple ways to attract and divert people's minds through internet web designing that developers invent to manipulate viewers. Artificial intelligence and internet technologies have developed methods of deep faking to manipulate people's minds. The findings suggest that manipulative features are added to influence people's behaviours. These manipulative features forcibly attract users into unknowingly using an application program even if they do not wish to. Most developers feel that ethical practices need to be maintained to safeguard the privacy of people and information; however, design practice still triggers manipulation. Manipulation of people's minds is programmed and designed under the managerial objective of organizations to gain more profit rather than to offer ethical practices.
Keywords: Artificial intelligence, well-being, technologies, managerial strategies, awareness, intelligent
system, social media
Introduction
Our world has become digitalized, creating an ever-expanding digital world of data. Digitization has led to a thriving information economy. Data is valuable because it makes it possible to make better decisions, such as which consumers should see which advertisements or which individuals should be looked into as potential scammers. In today's digital world, advertisers have grasped the chance to customize and target advertisements by using online data about customers. Such information may include the websites visited, articles read, and videos viewed, as well as all search terms entered into search engines. This phenomenon is called online behavioral advertising (Boerman et al., 2017).
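To make the targeting mechanism concrete, the sketch below is a hypothetical illustration (not drawn from any cited system) of how a behavioral-advertising scorer might rank advertisements against interests inferred from a user's tracked browsing; all names, tags, and data are invented.

# Hypothetical sketch of online behavioral advertising: advertisements are
# ranked by how well their topic tags match interests inferred from the
# user's tracked activity (pages visited, searches, videos watched).
from collections import Counter

def infer_interests(browsing_history):
    """Build an interest profile by counting topic tags across tracked pages."""
    return Counter(tag for page in browsing_history for tag in page["tags"])

def rank_ads(ads, interests):
    """Score each ad by the overlap between its tags and the interest profile."""
    scored = [(sum(interests[tag] for tag in ad["tags"]), ad["name"]) for ad in ads]
    return sorted(scored, reverse=True)

history = [
    {"url": "running-blog.example", "tags": ["fitness", "running"]},
    {"url": "shoe-shop.example", "tags": ["shoes", "running"]},
]
ads = [
    {"name": "trail running shoes", "tags": ["running", "shoes"]},
    {"name": "office furniture", "tags": ["furniture"]},
]
print(rank_ads(ads, infer_interests(history)))
# [(3, 'trail running shoes'), (0, 'office furniture')]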
Manipulation of people's minds is a preconditional design that most companies are attracted to for gaining profit. However, in the momentum of gaining profit, some companies do not always practice ethical ways. As a preconditional design, it is a method of influencing preferences through a means of cognitive balance (Izuma and Adolphs, 2013).
In this digital world, processed and analyzed data serve as the foundation for both human and automated decision-making processes that ultimately affect the physical world. We use digital technologies more frequently and are relying on them more and more for all types of necessities, whether for banking, media, education, healthcare, or the legal system. In this way, digitalization technologies are already manipulating people's minds by making digital technologies a compulsory means for undertaking any activity. This raises the issue of complete dependence on technologies nowadays. However, most people do not understand who actually benefits from the use of digital technologies. Moreover, people also have a tendency to ignore the facts about data privacy (Gerhart, 2017).
Big data advancements and intelligent algorithms based on Artificial Intelligence (AI) are essential components of the technology. These developments, for example, play a role with Internet of Things (IoT) devices that send information to the cloud (big data) and are at the same time steered by data and algorithms from the cloud to perform a specific action in the physical world (Royakkers et al., 2018). In some areas, smart algorithms and intelligent systems are already taking over decision-making from people, for example with armed drones or in smart cars. This has further manipulated people's minds to demand more hi-tech equipment for their daily use as well, such as Google Home and Alexa. Technologies embedded in advisory apps on our smartphones or in smart street lights can be persuasive and may influence our behavior and autonomy towards manipulation of the mind in subtle ways.
AI technologies are progressively being used in nearly all industries and are consequently making their way
into many aspects of our daily lives as a result of expanding processing power and the availability of
extensive, high-quality datasets. However, the application of AI-based algorithmic systems raises ethical concerns, challenges societal norms, and puts into question numerous core principles (Dignum, 2018). This relates to issues like fairness and discrimination, privacy and human autonomy in semi-automated decision-making, risks of personal and societal surveillance, threats to democracy posed by social media misinformation, and threats to human life posed by autonomous weapon systems or drones (Mittelstadt et al., 2016).
Dark patterns are user interfaces that purposefully mislead users, make it difficult for users to communicate their true preferences, or manipulate users into acting in a certain way (Luguri and Strahilevitz, 2021). Such interface design is an increasingly common occurrence on digital platforms, including social media websites, shopping websites, mobile apps, and video games. Dark patterns can mislead and deceive users by causing financial loss, tricking users into giving up vast amounts of personal data, or inducing compulsive and addictive behavior in adults and children. This serves the profit motives of an organization while ignoring ethical practices, by creating addictive and impulsive attractions.
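As a concrete illustration, here is a minimal sketch, assuming a purely hypothetical consent dialog, of the "obstruction" style of dark pattern: accepting tracking takes one tap, while refusing is buried behind several extra steps, so abandoning the flow leaves the pre-selected choice in place. The flow and step names are invented for illustration.

# Illustrative sketch of an "obstruction" dark pattern (hypothetical flow,
# not taken from any cited platform): consent is pre-selected in the
# operator's favour, and declining demands several additional steps.
def consent_flow(user_clicks):
    consent = {"tracking": True}  # default already set against the user's interest
    if user_clicks == ["accept_all"]:
        return consent            # a single click confirms the default
    # Opting out requires opening settings, toggling the purpose off,
    # and confirming twice; anything short of that keeps tracking on.
    opt_out_path = ["open_settings", "toggle_tracking_off", "confirm", "confirm_again"]
    if user_clicks == opt_out_path:
        consent["tracking"] = False
    return consent

print(consent_flow(["accept_all"]))                            # {'tracking': True}
print(consent_flow(["open_settings", "toggle_tracking_off"]))  # {'tracking': True} - flow abandoned
print(consent_flow(["open_settings", "toggle_tracking_off", "confirm", "confirm_again"]))
# {'tracking': False}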
Digitalization offers many benefits, including greater access to information, improved communication, and
enhanced efficiency. On the other hand, there are concerns that digital platforms are being used to
manipulate people's behavior and beliefs, often for commercial gain. This issue raises important ethical
questions about the responsibilities of technology companies and the need to balance profit with the
protection of people's rights and wellbeing. Trittin-Ulbrich et al. (2021) state that the contributions of digitalization are designed for social benefit; however, there are negative consequences when it is used for business growth, cultural, and political reasons. Therefore, the purpose of this study is to explore how the development of digitalization manipulates people's minds for generating profit and whether ethical considerations towards the users are maintained.
Research Questions
1. How is digitalization manipulating people's minds?
2. How are modern digital technologies applying ethics in operational applications?
3. What are the implications of AI development for profit or ethics?
Problem Statement
Individuals are losing control over their personal information as they unknowingly give permission to companies and organizations to collect, store, and use it when they use digital services. Most people are drawn into this unknowingly, assuming every digital activity presents truth or fact, which may not always be the case. People check their devices frequently because of pings, pop-ups, and notifications. Once interrupted, it takes several minutes to get back into the flow, so even just "one small look" can have significant effects on our ability to concentrate. Likewise, many people are not aware of their loss of privacy as they normally do not read the terms of usage. Moreover, manipulation of people's minds is instigated by the profit that organizations make; however, ethics in this connection are not fully taken into consideration. Although business ethics are primarily concerned with protecting consumers' rights, they are not being applied as norms, rights, and responsibilities.
Literature review
This section reviews literature on technological advancement through digitalization and its features for manipulating people's minds. The section has been segregated into different sub-titles to make it easier for readers to understand the different views offered by various related studies.
Digitalization for individual conveniences
Digitalization refers to the process of transforming a business model through the use of digital technologies,
which offers new opportunities for generating revenue and value. It involves three key elements: sensors,
devices that enable smart systems, and connectivity integration that links these devices to the computer and
digital platforms. When combined, these three elements allow for the use of predictive and prescriptive
analytics, which can provide effective solutions for businesses and generate value (Porter and Heppelmann,
2014). Digitalization is also defined as the integration of technology, business models, data, and information
in the creation of products, services, and processes (Paasi, 2017). This can lead to innovative digital solutions
through the use of new technologies. However, the moral and ethical implications of significant
technological advancements are not always considered. Royakkers et al. (2018) found that digitalization had
various social and ethical implications, including privacy, autonomy, safety and security, balance of power,
human dignity, and justice. These themes emerged as a result of the impact of digitalization.
The collection and utilization of data is driving innovation and the development of new business models
(Acquisti, 2014). Techniques such as geolocation and social media tracking are being used to customize and
streamline operations (Kosinski et al., 2013). Private companies and government entities are driving the
digitalization trend, which has both micro and macro-level effects. People are becoming increasingly
concerned about how their personal data is collected and stored, resulting in the creation of privacy-focused
services and regulations (Bogusz, 2019). Digital tools and data are also influencing individual behaviors and
societal norms, such as addiction to computer games and changes in online and offline behavior due to
surveillance technologies. The impact of digitalization extends beyond individuals to society as a whole, the
environment, and issues of equity. Mäkinen and Junnila (2023) mention that data surveillance must be for the benefit of society rather than for the purpose of business gain.
Digitalization manipulation and awareness
The concept of manipulation refers to covert influence, which can be interpreted as deceptive or involving
cunning and subterfuge. This type of hidden influence can be observed in interpersonal interactions and is
increasingly present in interactions with intelligent technologies, both online and offline (Floridi, 2014). This
critical perspective on human-machine interactions is part of a broader trend that identifies techno-social
engineering as a problematic process in which technologies and social forces combine to influence our
thoughts, perceptions, behavior, and actions in potentially concerning ways (Klenk, 2021). Successful
manipulation depends on identifying and exploiting people's decision-making weaknesses, similar to
coercive tactics. Manipulation violates morality because it disregards people's autonomy by withholding
information and preventing them from making decisions that align with their values and beliefs. Such
deceptive online behaviors also violate the requirements of the General Data Protection Regulation (GDPR)
for transparency and fairness since users are not adequately informed about the collection and processing of
their data (Boynton et al., 2013).
Deceptive practices used in online businesses are becoming more apparent. Because of information technology, these practices can be used effectively and cheaply on a wide scale, in various contexts, and with great sophistication. The goal of such tactics is to influence users to make purchases, prolong their usage of a service, and convince them to accept privacy-intrusive features. This results in the infringement of users' rights to the protection of their personal data and exposes them to privacy risks. Online manipulation disregards legal protections and deprives uninformed individuals of their capacity to make independent decisions (Bongard-Blanchy et al., 2021).
Manipulation methods have the power to alter people's decisions, behavior, and eventually the behavior of the society made up of these individuals by undermining their autonomy through covert manipulative strategies (Botes, 2022). An example is advertising a product while emphasizing its scarcity, which prompts the consumer to make a rapid decision based on that assumption; otherwise, the consumer could have taken more time to explore his or her options without the pressure of the product's alleged scarcity. Although the person is unable to determine the actual quantity of products available, which demonstrates the covert nature of the manipulation, the person is nevertheless able to weigh his options in light of the information presented about the product, albeit under time pressure, which may exploit individual weaknesses.
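A minimal sketch of that scarcity cue is given below; it assumes a hypothetical storefront where the "only N left" banner is driven by a display rule rather than by the true stock level, which is exactly the covert element the example describes. All values are invented.

# Hypothetical sketch of an artificial scarcity cue: the urgency banner is
# governed by a display rule, not by the real inventory, so the shopper
# cannot verify the claimed shortage.
def scarcity_banner(true_stock, display_floor=3):
    if true_stock == 0:
        return "Out of stock."
    shown = min(true_stock, display_floor)  # never reveal plentiful inventory
    return f"Hurry! Only {shown} left in stock - offer ends soon."

print(scarcity_banner(true_stock=500))  # "Hurry! Only 3 left in stock - offer ends soon."
print(scarcity_banner(true_stock=2))    # "Hurry! Only 2 left in stock - offer ends soon."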
Ethics in Digitalization
The ethical principle of autonomy is based on the idea that human beings should always be treated as
individuals with inherent worth and dignity, rather than simply as resources to be exploited (Botes, 2022). In
practice, this means respecting people's rights and choices. However, in our increasingly digitized world, it
can be difficult to determine what autonomy truly means and how manipulative techniques can interfere with
people's decision-making processes.
Manipulative technologies can change the way people perceive certain situations and influence their beliefs,
ultimately affecting the choices they make. If this process occurs without people's awareness, they are being
controlled by these technologies and used for their own ends, rather than being respected as autonomous
individuals. The integration of technology into society creates a situation where manipulative individuals can
erode people's autonomy and control over their decision-making process (Klenk and Hancock, 2019).
Individualized customer information is crucial to the success of online commerce. By collecting and
analyzing customer data, companies can gain valuable insights into their customers' preferences and
behavior, which can help them to personalize their marketing and improve the customer experience (Sarathy and Robertson, 2003). However, the collection and use of such data can also raise concerns about digital privacy.
Consumers are increasingly aware of the potential risks associated with sharing personal information online,
such as identity theft, data breaches, and other forms of cybercrime. This has led to a growing demand for
greater transparency and control over how personal data is collected, stored, and used by businesses.
Ethic vs. Profit
Some technologies, which initially aimed to persuade people toward better lives and futures, unwittingly
changed into manipulative technologies that are used to satisfy the intentions of their coders and other
relevant stakeholders, even though those intentions may not be in line with the values and beliefs of the
people these technologies are targeting. However, the line between manipulation and persuasion is frequently
fuzzy and unclear (Botes, 2022). The proliferation of manipulative technologies has been exacerbated by the
prevalent ethic vs. profit viewpoint, wherein pursuing the goal of financial gain often takes prominence over
principles of ethics. Despite the fact that there are enough ethical principles in place to protect people from
manipulative technologies, a thorough analysis of the components of these ethical principles is still lacking
(Botes, 2022).
Since the beginning of the Internet, people have been trying to make money from online users, using various
marketing techniques such as online sales and advertising. Digital marketing has expanded to include
methods such as email, SMS, and web advertising, but the question remains about what types of marketing
techniques are ethical or immoral online (Gautam, 2014). While standard advertisements are commonly
used, some methods, such as pop-up ads that cover the whole screen, cookies that track users' behavior on
websites, and spam emails, may not be considered ethical. Digital marketing that is consent-based can be
effective as it considers the needs and interests of the consumer, but if the consumer is uninterested, the
advertisements may not be effective (Rowley, 2001).
Diversion of people's minds through digital techniques
Social media distraction is the phenomenon of concentration being diverted away from the task at hand and towards social media. Both internal and external stimuli may be present (Wilmer et al., 2017). The distraction can come from outside influences, such as receiving a notification, or from within oneself when the user begins thinking about social media, such as unanswered messages. When people are working on a task, pensive thinking might cause internal distraction. As an example, prior research showed that while learning via the internet, students' thoughts frequently turned to social media (Hollis and Was, 2016).
Rather than letting people make judgments when they are relaxed, composed, and at their best, certain
algorithms are intentionally created to target consumer vulnerabilities and actively direct their attention to
specific aims (Kant, 1996). Current research has started to look at how technology can limit users' autonomy.
For instance, a lot of online games and platforms are made to elicit a dopamine reaction in consumers that
keeps them coming back for more. Games and social media products drive users to seek the positive
feedback from the engagement to the point where their lives might be disturbed. Technology companies
create a vulnerable consumer by using addictive design patterns (Brenkert 1998). Addictive design threatens
consumers' autonomy by manipulating them and taking advantage of their behaviors.
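The sketch below illustrates the kind of variable-reward loop this passage describes, under the assumption of a simplified feed that releases positive feedback on an unpredictable schedule; the probability, messages, and seed are invented for illustration only.

# Illustrative sketch of a variable-reward loop: feedback such as new likes
# is released on an unpredictable schedule, which is what keeps users
# returning to check the app.
import random

def simulate_checks(n_opens, reward_probability=0.3, seed=42):
    """Return the outcome of each app open under an intermittent-reward schedule."""
    rng = random.Random(seed)
    return ["new likes and comments!" if rng.random() < reward_probability
            else "nothing new" for _ in range(n_opens)]

print(simulate_checks(10))
# Rewards appear at unpredictable positions across the ten app opens.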
The covert or concealed nature of manipulative practices, which try to convince or influence people against
their inherent values and beliefs, is what explicitly links manipulation and autonomy. Pervasive online
manipulative strategies are unethical to the extent that they limit or hinder someone from being able to
manage their own lives according to their own free will decisions. The most serious issue facing society
today, in this perspective, is not the exploitation of cognitive biases and vulnerabilities, but rather the fact
that many of these exploitative techniques are made to be carried out covertly, giving the hackers influence over how individuals perceive their futures (Botes, 2022).
Methodology
This study employs a qualitative research approach to explore the topic of manipulation of people's minds through digitalization, focusing on ethical considerations versus profit motives. The research involved a total of 10 respondents, consisting of 5 developers and 5 users who were adults and professionals. The respondents were selected from user groups as well as developers employed by organizations engaged in digital technologies, ensuring a diversity of viewpoints. An open-ended interview technique was used for data collection. Two sets of questions were created, one for the users and one for the developers, with 16 questions in the interview schedule. The interview schedule was designed to explore the respondents' experiences, attitudes, and viewpoints about the ethical consequences of manipulating their minds and the role that profit motives play in this instance. Data were analyzed using coding and the data triangulation analysis method. All data collected from the interviewees were transcribed for validity. Codes were developed and categorized, and interpretations were made to give further meaning to the data. These coded data were triangulated for analysis purposes.
Findings and discussion
This study explores the sociological perspectives of manipulation from general users and from the professionals who develop digital applications intended to be user-friendly and to ease ongoing lifestyles. However, many organizations and developers intend to gain profit but have limited concern for ethics.
Manipulative mind through digitalization
In this study we explored how developers and users perceive the use of digital applications as practicing for profit or for ethics. Preferential influence towards persuading individuals and groups to purchase goods is a cognitive balance of manipulative features that are applied in applications (Izuma and Adolphs, 2013). The results suggest that most users claim that digital applications have manipulative features that have also manipulated their emotions and behavior. In this connection, user 1 mentions: “Yes, I have encountered advertisements that persuade us to purchase products or services. For instance, schools and colleges promise parents to improve their children's grades and easily pass exams if they enroll their children, using emotional appeals to attract them.” Manipulative features can be decisive, and information shared with the public can carry preferential, influential ingredients, which leads people to trust it without knowing whether the information shared is correct. These features are activated by the developers of the application or programme. Reflecting the experiences shared by a user, she states;
“I have come across news articles containing manipulative content with a political agenda. Some
media sources intentionally present biased information that can misrepresent facts to shape public
opinion and influence readers’ emotions. I have seen scammers spreading false information,
promoting fake donation requests, and using deceptive tactics to gain our personal information or
financial advantage”.
Similarly, user 2 says;
“If YouTube videos count as well, then yes !! I have encountered numerous such videos on YouTube
that look like they are presenting facts but in reality are just modification of data in such a way that
the viewers might get manipulated easily. Spiritual videos of some fake gurus and some Political
commentary news vlogs are the examples”.
In the same connection, user 3 states;
“Yes. Fair and lovely advertisement, Air conditioner advertisement, many more. You can say all
contents, ads are made to persuade and manipulate”.
There could be multiple ways to attract and divert people's minds. Additional content and advertisements can push individuals towards craving products through these media. The sudden arousal of a desire to want these products can be deceiving to many. This can be a way to generate profit for the organization. Overlapping content on applications can also attract people's minds towards craving such services or products. Moreover, digitalization has grown so far that sometimes the content in an application may not be true or correct. This is where a breach of ethics can be a major concern, as many individuals believe and trust such displays as real when in reality they can be fake.
Moreover, AI and IT have developed so much that deep faking has come into practice to manipulate people's minds. There has been a lot of debate about avoiding such deep-faked content; however, many people are already manipulated by it. AI and IT have moved rapidly with multiple designs that somehow defy the intelligent minds of people. Not all individuals are digitally intelligent or have knowledge about being manipulated through some unavoidable content; for them, emotional captivation is easy, and thus profit can be maximized.
Supporting this view, user 4 says;
“I'd say persuade rather than manipulate. I see lots of advertisements and have worked on multiples.
The main objective of advertisement is to elicit an emotion and help people solve their problems. But
being said that, there will always be black hat players in the scene everywhere. But, me personally, I
don't would be easily manipulated by ads or news as I always fact check things and search for cold
hard proof before I take any action”.
Similarly, user 5 is of the same opinion, saying;
“Yes, news article on online portals manipulates my thought process for example side effects of junk
food”.
The developers either work for an organization or design the application and content for organizations. In general, some developers working on their own, not for profit, ill-practice ethical behavior by hacking databases and deep-faking portraits. In such circumstances, ethical practice is considered to be totally ignored, and this is also illegal. While many are aware of cyber crime law, there is hardly any action plan unless some secured database is hacked. However, small content that persuades and manipulates the emotional behavior of people is ignored by many. This has led developing agencies to prioritize creating more manipulative content to persuade people's minds to divert and crave making decisions to want such services and products.
Supporting this view, Developer 3 says
“Targeted advertisement I think directly affects the decisions people make. While emotion of people
is concerned to attract them with additional content, developers instigate to design such products
where people unknowingly put themselves in a position to view such content. The IT have become
superficially intelligent to determine the user’s behavioural pattern through the devices they are using
and the frequency of their visiting pattern”.
Developer 4 indicates;
“Most common ways that digital technologies or platforms have been designed to influence people's
behavior are - they are simple, easy to use, easier to understand, interactive, addictive, targeted
contents, etc.”
Moreover, developers feel that ethical practices must be maintained to safeguard the privacy of users. In this connection, developer 1 states that the line between ethics and profit must also be maintained. Impractical ways of exploiting users through manipulative pop-ups and advertisements can misguide the user while surfing a particular application. Developer 1 mentions;
“Companies should prioritize user privacy and well-being, implement transparent data practices, and
explore alternative revenue models to strike a balance between ethical implications and profitability”.
The clarity of using the application can have various positive consequences for the users. This enables users to trust the application and use it without hesitation. In this regard, developer 2 says;
“They should make sure to clarify all the consequences about their products and services that they
sell to the end users”.
Internet data are not secure, and data from personal sources can be used for any purpose. Although an application may claim that any information provided by individuals is contained, it can still be compromised. Moreover, some applications seek the approval of individuals before using the information through multiple choices such as allow and block. However, if an individual does not give access to the use of personal data, the application may not work or function properly. In some cases, if the application is not given access, it cannot be used for further assistance. This is one form of manipulation that is generated within the application while developing it. Although the application claims the information will not be shared, it is nevertheless being compromised. This is how the users are manipulated. Furthermore, it may be that only limited information is being compromised, and although users are being manipulated through the use of the application, some general data may be compromised, for which developer 5 claims that;
“The designed system should take only general data of customers that do not harm in user decision
making”.
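A minimal sketch of the permission-gating pattern described above is given here, assuming a hypothetical application whose core features are tied to data-access consent; the feature names and required permissions are invented for illustration.

# Hedged sketch of permission gating: core features are tied to consent,
# so refusing data access effectively disables the product instead of
# offering a genuinely optional choice.
REQUIRED = {"news_feed": {"contacts", "location"}, "photo_upload": {"camera"}}

def open_feature(feature, granted_permissions):
    missing = REQUIRED.get(feature, set()) - granted_permissions
    if missing:
        # No degraded mode is offered; the feature simply refuses to run.
        return f"'{feature}' unavailable: grant {sorted(missing)} to continue"
    return f"'{feature}' opened"

print(open_feature("news_feed", granted_permissions={"camera"}))
# 'news_feed' unavailable: grant ['contacts', 'location'] to continue
print(open_feature("photo_upload", granted_permissions={"camera"}))
# 'photo_upload' opened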
In any circumstance, even when an application claims to be safe, it may not be; somehow or other, the information can be compromised, and this may not be disclosed to the users. This is how users of digital technologies are being manipulated. More sophisticated applications are being developed, and more modern devices are required to use them. In other words, the developers of an application are also manipulating users and individuals to acquire modern devices. Moreover, in the usage of applications with these technologies, both the manufacturers of modern devices and the developers play a role in manipulating the users if the users are to keep pace with the digitalized era. Either way, it can be understood that users are always being manipulated, and this manipulative behavioural pattern persuades users and individuals to acquire a new device and to use modern application facilities that are specifically designed for whoever the end users may be.
Ethic versus profit
Ethics versus profit is a long-standing debate in managerial implications. Managerial implications can create stress over whether to incline towards profit or towards applying ethical considerations (Rajbhandari and Jans, 2023). This stress, through the implications of managerial objectives, points more significantly towards gaining profit. Although many organizations claim to be practicing ethical behavioural patterns for the benefit of the individuals who use their devices and applications, they are often just claiming this to win the trust of users. In practice, organizations and applications have manipulative features to attract users so that profit can be maximized. This mindfulness must be conveyed to the users. Developer 1 states;
“Companies and developers vary in their awareness of the consequences of design decisions on users,
but there is increasing recognition and action being taken to prioritize user well-being through ethical
design practices. However, some may still prioritize engagement and profitability over user welfare”.
In the same line, developer 4 also mentions;
“Companies should always use costumer's personal information ethically or safely and protect it from
unauthorized and illegal use and also from misuse of the information. Companies can use customer's
information for targeted advertising while making profit but the information should be regulated with
the algorithms within the company only and it should not manipulate the data and sell to partner or
others companies for any of the business or personal use. Company should always take responsibility
to take care of their customer’s information and should always use that information properly and
ethically”.
Although governments in some countries have introduced cyber laws against unethical behavior, there are still a few elements involved in hacking into personal data. This can be done through a modern device or through applications installed on users' devices. In recent years, manipulation has been folded into seemingly ethical practice by having default applications pre-installed on the device, with updates regularly available. Some of the installed applications are permanent and cannot be uninstalled or disabled. Technically, an unused application can be disabled, but most users are not technologically savvy, and some may not have time to look into the details of what the applications offer.
While managerial implicative approaches are applied in any organization, the motive of profit maximization cannot be ignored. However, deceiving users through modern devices and forcing them to use an application can also be controlled to some extent so that trust is developed.
Developer 4 mentions;
“Growing uses of artificial intelligence and machine learning have made development of digitization
in human life to the next level. AI can do users work very fast and repeatedly in the same speed. AI
can understand the behavior of user and acts like a friend. Though human can use ethics for work but
AI can be unethical if it fails or altered it's programs. It can kill many life”.
In the same line, developer 5 says
“Growing AI can change people jobs and there lifestyles. AI will make people mind dull and
unproductive”.
The profit maximization policy of an organization can serve multiple goals: either a personal goal or the organizational goal. Within the thin line between a personal goal and the organizational goal, a personal goal can be very intense. Rajbhandari (2011) illustrates a FOSS model (Focus, Optimistic, Striving and Smile) to explain the intention of an organizational leader to achieve a personal goal or to maintain the organizational goal. While the organizational goal reflects the welfare and well-being of employees and users, with the intention of applying Corporate Social Responsibility (CSR), the personal goal deflects the intention of organizational leaders towards achieving gains for themselves. Developer 2 states;
“Yes. Because they have not built the applications for helping the poor ones. The founders are doing
and building to make billions so that they can ride Lamborghini”.
In the same vein, developer 3 mentions;
“Definitely. The only purpose for designing application for most of the cases is to make a profit.
Because running an application requires a lot of resources”.
While discussing the thin line between ethics and profit, it is certain that most organizations are working to generate a profitable business, and for this, ethics can be compromised. To support this view, user 1 states;
“Yes, companies are developing the applications with the intention of making profit. there is huge
competition in market in innovative development field. They monetizes the user data, enable
subscription mode,etc. for making profit”.
In the same connection, supporting these views, user 3 mentions;
“Yes, companies develop applications to make a profit. They do so to sustain their business, achieve
a return on investment, meet market demand, outperform competitors, utilize monetization models,
and fulfill obligations to investors and stakeholders”.
Agreeing with the view that profit can be an organization's primary motive ahead of ethical practices, user 4 also mentions;
“Why would they create applications? If not for creating profit? Unless they're a heavily funded
organization that can survive without profit, every application/software is designed to create profit,
may the profit be in short term or longer term in others”.
User 5 expresses the same opinion, stating;
“Yes, majority of the corporates exist to generate profit”.
There are multiple ways to manipulate users; these manipulative features in an application or in any modern device forcibly attract the user to unknowingly use the program. This may be unethical towards users, but it can be a profit maximization strategy for the organization. Developers can have a major role in applying this strategy by embedding a program that pushes users into accepting its usability. The consequences can be different. In this connection, user 2 supports this view by relating his experience; he says;
“Advertisement can be one example, they first make us insecure about ourselves just to make us buy
their product. Yes, that one time when I was searching for online course details and the next day I got
call from them if I want to register for course (very terrifying and confusing at the same time)”.
A program engraved into an application activates while the user is online and can initiate a redirect towards another platform, so that users are forcibly linked to a different platform despite their unwillingness. Some applications are designed in such a manner that it is almost impossible for the users to end the program; the skip and stop buttons are almost invisible or not present in the running application.
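The behaviour described here can be sketched as below, assuming a hypothetical interstitial whose dismiss control is withheld for several seconds and then rendered almost invisibly; the timings, labels, and redirect target are invented and are not drawn from any real ad system.

# Illustrative sketch of an interstitial that is hard to end: tapping anywhere
# redirects to another platform, and the skip control appears only after a
# delay, in a barely visible style.
def render_interstitial(seconds_elapsed, skip_delay=8):
    view = {
        "content": "full-screen promotion",
        "tap_anywhere_redirects_to": "partner-platform.example",
    }
    if seconds_elapsed >= skip_delay:
        view["skip_button"] = {"label": "x", "opacity": 0.15}  # present, but barely visible
    return view

print(render_interstitial(seconds_elapsed=2))  # no skip control yet; any tap redirects
print(render_interstitial(seconds_elapsed=9))  # a faint skip control finally appears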
Organizations today are craving success in a competitive world. This is also due to rapid growth in modernization and technological innovation. The general public is left with multiple choices of products and services. Earlier, manipulation was physically created by patterns of colourful design; however, clever people could easily avoid such attractions. In this era of modernization, manipulation is observed mostly in service sectors; moreover, this manipulation in the service sector has also become connected with internet technology and artificial intelligence, both in physical products and in computerized programmed applications. Either way, this manipulative simulation is forcibly designed to attract consumers.
Gaining competitive advantage, or absolute advantage, in this digitalization era can be difficult. This is because of frequent development in technology through new innovations in AI, whether in computerized applications serving the purposes of consumers or in general.
Nevertheless, organizations gain competitive advantage by manipulating people's minds, attracting them to get involved with an application knowingly or unknowingly. This can be dangerous for people, as most data could be leaked and breaches of data security and personal social security may become publicly exposed.
Do ethics and profit go together? When ethics are maintained, sacrifices in profit margin have to be accepted; when an organization concentrates on profit, ethics have to be compromised. However, most organizations and developers do not want to compromise profit for ethics. The norms of managerial practice have been diverted into profit-making for entrepreneurs, while on the other hand the general people using the applications demand ethics. The thin line bordering the relationship between ethics and profit may be erased, and this can be a concern for all.
In recent days, entrepreneurial development has also taken different shapes. Entrepreneurs are innovators, and developers consider themselves innovators since they are the ones who create new digitalized applications. This can be considered correct under some definitions of entrepreneurship; however, entrepreneurs are also visionaries who create an environment for social welfare, wellness, and well-being. These aspects are ignored by the developers and the organizations that initiate new applications focused on the organization's products, forcing individuals to get involved. This is where the thin border line is being erased. Moreover, the developers and organizations claiming an entrepreneurial role are taking up managerial and administrative activities rather than becoming socially obliged to the people using their applications.
Conclusively, manipulation of people's minds through digitalization exists in almost every device and application that users prefer to use. It is also evident that some applications are programmed by default, which can instigate users to use them when purchasing modern technologically equipped devices. Moreover, recently developed technological devices, whether Google Home, Alexa, Apple HomeKit, or any other internet-connected device, can have features of a manipulative nature engraved in them. This is due to the sophisticated use of modern devices, which manipulates people's minds and is developed with the intention of profit maximization by organizations.
Somehow, it is also evident that most organizations and developers are inclined towards the maximization of profit, where ethics and profit can collide. Nevertheless, choices are made for profit maximization, while consideration of ethics is found to be minimal. The results also suggest that manipulation of people's minds cannot be avoided for the sake of the organization's longer-run existence and its need to compete with rival organizations or developers. The strategies for manipulating people's minds are evident and are made possible through the use of technological intelligence or AI technologies. These programmed technologies are making users dependent on their applications and programs. The results also suggest that both organizations and developers have equal possibilities to develop AI to manipulate people's minds, either knowingly or unknowingly, and almost all users fall under this stream of manipulation. This is also because dependency on technology has tightly knotted the social and professional environment together. However, would there be any escape from the AI and digitalization culture in this or the coming era? Would digitalization be free from manipulative features? And will ethics be given first priority over profit maximization in the near future, while most organizations speak about becoming socially responsible towards the community?
References
Acquisti, A. (2014). The economics and behavioral economics of privacy. In Privacy, big data, and the public good: Frameworks for engagement (pp. 76-95).
Boerman, S. C., Kruikemeier, S., & Zuiderveen Borgesius, F. J. (2017). Online behavioral advertising: A literature review and research agenda. Journal of Advertising, 46(3), 363-376. doi:10.1080/00913367.2017.1339368
Bogusz, C. I. (2019). Digital identity beyond verification. In Digital transformation and public services (pp. 214-232).
Bongard-Blanchy, K., Rossi, A., Rivas, S., Doublet, S., Koenig, V., & Lenzini, G. (2021). I am Definitely
Manipulated, Even When I am Aware of it. It's Ridiculous! - Dark Patterns from the End-User
Perspective. Designing Interactive Systems Conference 2021, (pp. 105-116).
doi: 10.1145/3461778.3462086.
Botes, M. (2022). Autonomy and the social dilemma of online manipulative behavior. AI and Ethics, 1(1), 81-93. https://link.springer.com/article/10.1007/s43681-022-00157-5
Boynton, M. H., Portnoy, D. B., & Johnson, B. T. (2013). Exploring the ethics and psychological impact of deception in psychological research. IRB, 35(2), 7-13.
Brenkert, G. G. (1998). Marketing and the vulnerable. The Ruffin Series of the Society for Business Ethics, 1, 7-20.
Dignum, V. (2018). Ethics in artificial intelligence: Introduction to the special issue. Ethics and Information Technology, 20, 1-3.
Gautam, H. (2014). Ethical issues in digital marketing. Global Journal for Research Analysis, 3(10), 102-103.
Floridi, L. (2014). Fourth revolution: How the infosphere is reshaping human reality. Oxford University
Press USA.
Hollis, R. B., & Was, C. A. (2016). Mind wandering, control failures, and social media distractions in online learning. Learning and Instruction, 42, 104-112. doi: 10.1016/j.learninstruc.2016.01.007
Izuma, K., & Adolphs, R. (2013). Social manipulation of preference in the human brain. Neuron, 78, 563-573. doi: 10.1016/j.neuron.2013.03.023
Luguri, J., & Strahilevitz, L. J. (2021). Shining a light on dark patterns. Journal of Legal Analysis, 13(1), 43-109. https://doi.org/10.1093/jla/laaa006
Kant, I. (1996). Groundwork of the metaphysics of morals. In Gregor, M. (Ed.), Practical philosophy (p. 429). Cambridge, United Kingdom: Cambridge University Press.
Klenk, M. (2021). Manipulation: Sometimes hidden, always careless. Review of Social Economy, 79(1), 113-
115.
Klenk, M., & Hancock, J. (2019). Autonomy and online manipulation. Internet Policy Review.
Kosinski, M., Stillwell, D., & Graepel, T. (2013). Private traits and attributes are predictable from digital
records of human behavior. Proceedings of the National Academy of Sciences of the United States of
America, 110(15), 5802-5805.
Mäkinen, L. A., & Junnila, J. (2023). Finnish young people's perceptions of privacy regarding data collected when using their mobile devices. In Samuelsson, L., Cocq, C., Gelfgren, S., & Enbom, J. (Eds.), Everyday life in the culture of surveillance (pp. 145-166). Nordicom, University of Gothenburg. doi.org/10.48335/9789188855732
Mittelstadt, B. D., Allo, P., Taddeo, M., Wachter, S., & Floridi, L. (2016). The ethics of algorithms: Mapping
the debate. Big Data & Society, 3(2). https://doi.org/10.1177/2053951716679679
Paasi, J. (2017). Business models in the future of manufacturing. In Towards a new era in manufacturing: Final report of VTT's For Industry spearhead programme (pp. 24-33). VTT Technical Research Centre of Finland.
Porter, M. E., & Heppelmann, J. E. (2014). How smart, connected products are transforming competition. Harvard Business Review, 92(11), 64-88.
Rajbhandari, M. M.S. & Jans, D. (2023). Managing people through managing organizational stress. Research
annals of Xavier’s Nepal, Management and Social Sciences series, 5.0, 1-12.
Rajbhandari, M. M. S. (2011). Strengthening the strength of public-private partnership model in education: A case study of Durbar High School in Nepal. ED516967.
Rowley, J. (2001). Remodeling marketing communications in an Internet environment. Internet Research, 11(3), 203-212.
Royakkers, L., Timmer, J., Kool, L., & van Est, R. (2018). Societal and ethical issues of digitization. Ethics and Information Technology, 20(2), 127-142. doi.org/10.1007/s10676-018-9452-x
Sarathy, R., & Robertson, C. J. (2003). Strategic and ethical considerations in managing digital privacy. Journal of Business Ethics, 46, 111-126. https://doi.org/10.1023/A:1025001627419
Trittin-Ulbrich, H., Scherer, A. G., Munro, I., & Whelan, G. (2021). Exploring the dark and unexpected sides of digitalization: Toward a critical agenda. Organization, 28(1), 8-25. https://doi.org/10.1177/1350508420968184
Wilmer, H. H., Sherman, L. E., & Chein, J. M. (2017). Smartphones and cognition: A review of research exploring the links between mobile technology habits and cognitive functioning. Frontiers in Psychology, 8, 605. doi: 10.3389/fpsyg.2017.00605