Automation, Algorithms, and Beyond: Why Work Design Matters More Than Ever in a Digital World
- Jonathan H. Westover, PhD
Abstract: Digital technologies—encompassing artificial intelligence, robotics, algorithmic management, and platform-based business models—are fundamentally reshaping how work is structured, controlled, and experienced. This article proposes that work design serves as a critical lens for understanding and managing these technological transformations. Drawing on sociotechnical systems theory and contemporary research, we demonstrate that technology's impact on work characteristics such as autonomy, skill variety, feedback, social connection, and job demands is not predetermined but depends on design choices, organizational contexts, and individual responses. We outline four complementary intervention strategies: proactively designing work roles during technology implementation; embedding human-centered principles in technology development and procurement; supporting organizational initiatives with macro-level policies; and expanding training beyond digital skills to include work design literacy for multiple stakeholders. The article concludes by identifying research priorities—including reconceptualizing autonomy in machine learning contexts, examining skill preservation mechanisms, and advancing interdisciplinary sociotechnical approaches—alongside practical recommendations for education, policy engagement, and stakeholder influence to ensure that technological advancement serves both human flourishing and organizational performance.
The digital revolution has reached a critical inflection point. Technologies once confined to science fiction—self-learning algorithms, autonomous robots, ubiquitous sensors tracking every workplace action—now permeate organizational life from factory floors to hospital wards to remote home offices. The stakes have never been higher. On one hand, technology promises to eliminate dangerous drudgery, enhance human capabilities, and unlock unprecedented levels of productivity and innovation. On the other, it threatens to deskill knowledge workers, erode autonomy through algorithmic control, and create precarious employment arrangements that leave workers isolated and powerless.
Yet amid the breathless predictions of job apocalypse or techno-utopia, a crucial question receives insufficient attention: How does technology actually change the nature of work itself? This is not merely an academic concern. The quality of work design—how tasks are structured, how much control workers exercise, whether jobs use people's skills, the feedback mechanisms available, and the social fabric of work—profoundly shapes employee wellbeing, learning, safety, and performance. These relationships are among the most robust findings in organizational psychology.
The paradox is that technology's effects on work quality are neither automatic nor uniform. The same algorithmic system can enhance or diminish job autonomy depending on implementation choices. Robotic automation can free humans for higher-skill work or trap them in monotonous monitoring roles. Remote collaboration technology can strengthen or fray social connections. These divergent outcomes reflect design choices—choices about technology configuration, work organization, management philosophy, and regulatory context.
This article argues for placing work design at the center of our understanding of digital transformation. We contend that insufficient attention is given to how technology alters the fundamental characteristics of jobs, leading to a reactive posture where humans must simply adapt rather than a proactive stance where both technology and work are designed to serve human needs and organizational goals. Drawing on sociotechnical systems thinking and contemporary evidence, we map how digital technologies affect five critical dimensions of work design, identify the factors that shape these impacts, and propose multilevel intervention strategies to steer technological change toward desirable futures of work.
The Digital Transformation Landscape
Defining Contemporary Technologies in the Workplace
The current wave of technological change encompasses several interconnected developments that collectively transform work in unprecedented ways. Artificial intelligence (AI) and particularly machine learning represent a fundamental shift: systems that learn from data and improve performance without explicit programming now make complex decisions previously requiring human judgment—from credit approvals to medical diagnoses to employee performance ratings. Unlike earlier automation, AI can handle cognitive and analytical tasks, extending technological substitution beyond routine manual work into domains once considered the preserve of educated professionals.
Robotics has evolved from fixed industrial arms performing repetitive tasks to collaborative robots (cobots) working alongside humans, and increasingly autonomous systems capable of adapting to unstructured environments. The introduction of surgical robots, warehouse logistics robots, and autonomous vehicles illustrates this progression toward systems that can sense, decide, and act with diminishing human oversight.
Algorithmic management delegates managerial functions—task allocation, scheduling, performance monitoring, and evaluation—to software systems. Uber's driver matching, Amazon's warehouse task optimization, and call center quality scoring represent a new form of control where algorithms rather than human managers direct work. These systems generate and depend upon big data—the continuous collection of granular performance information from sensors, digital traces, and connected devices.
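To make the idea concrete, the following is a deliberately simplified, hypothetical sketch of how an algorithmic manager might allocate a task: the system scores each available worker and assigns the task automatically, with no human manager in the loop. All field names, weights, and values are illustrative assumptions, not any real platform's logic.

```python
# Hypothetical sketch of algorithmic task allocation. The "manager" is a
# scoring function plus a max(): no human judgment is involved.
# All weights and fields are illustrative, not any real platform's logic.
from dataclasses import dataclass

@dataclass
class Worker:
    worker_id: str
    distance_km: float      # proximity to the task
    rating: float           # customer rating, 0-5
    acceptance_rate: float  # share of offered tasks accepted, 0-1

def score(w: Worker) -> float:
    # Weighted score favoring nearby, highly rated, compliant workers.
    return (5 - w.distance_km) * 0.5 + w.rating * 0.3 + w.acceptance_rate * 5 * 0.2

def assign(workers: list[Worker]) -> str:
    # Automated allocation: simply pick the highest-scoring worker.
    return max(workers, key=score).worker_id

workers = [
    Worker("A", distance_km=1.0, rating=4.9, acceptance_rate=0.95),
    Worker("B", distance_km=0.5, rating=4.2, acceptance_rate=0.60),
]
print(assign(workers))  # worker "A" scores higher overall
```

Note how the acceptance-rate term quietly rewards compliance: a worker who declines tasks, for whatever reason, drops in every future allocation decision.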
The platform economy leverages information and communication technologies to create decentralized marketplaces matching supply and demand, fundamentally restructuring employment relationships. Uber, TaskRabbit, and Upwork exemplify models where work is fragmented into discrete tasks or "gigs" coordinated through digital platforms rather than traditional organizational hierarchies.
Ubiquitous computing describes the embedding of sensors and networked devices throughout work environments, creating what some term the "Internet of Things." Work becomes increasingly mediated through digital interfaces that collect performance data, enable remote collaboration, and blur boundaries between physical presence and virtual connection.
What Makes This Time Different?
While technological disruption has characterized industrialization since the steam engine, several features distinguish contemporary digital transformation. Most fundamentally, the combination of big data and machine learning enables automation of cognitive and analytical work previously immune to substitution. Legal research, financial analysis, and even some creative tasks can now be partially automated, raising questions about which human capabilities remain distinctively valuable.
Unlike past automation that required humans to explicitly program every decision rule, modern AI systems operating through deep neural networks can discover patterns and develop decision strategies that their designers cannot fully explain. This opacity of machine learning models means that as technology becomes more capable, it simultaneously becomes less transparent and predictable—even to its creators. The implications for human control and accountability are profound.
The integration of sensing, data analytics, and automated decision-making creates systems of unprecedented scope and speed. Real-time optimization across entire supply chains, instantaneous matching of workers to tasks, and continuous performance monitoring operate at scales and tempos impossible with human management alone. This compression of time and expansion of scale intensifies both opportunities and risks.
Finally, the emergence of autonomous, self-learning systems that can adapt their behavior through interaction with environments and data represents a qualitative shift. Technology acquires a form of agency, becoming what some scholars describe as "unpredictable as humans." The traditional model of technology as a predictable tool under human control no longer fully applies when machines learn and evolve in ways their designers cannot anticipate.
A Central Role for Work Design
Why Work Design Matters
Work design—the content and organization of tasks, job characteristics, and broader work roles—represents a critical determinant of employee and organizational outcomes. Decades of research anchored in theories like the Job Characteristics Model, Job Demands-Resources framework, and sociotechnical systems thinking demonstrate that how work is structured shapes motivation, strain, learning, performance, safety, and health.
Jobs offering autonomy (control over decisions, methods, and timing) enhance employees' sense of meaning, enable active coping with demands, and support efficient localized decision-making by those closest to operational variances. Skill variety, task significance, and task identity foster intrinsic motivation and challenge perceptions that sustain engagement and performance. Job feedback and related characteristics like role clarity support mastery, skill maintenance, and continuous improvement. Social and relational aspects—opportunities for interaction, social support, and connection with beneficiaries—meet fundamental human needs for belonging and enable coordination in complex work.
Conversely, poorly designed jobs characterized by excessive demands, insufficient resources, fragmented tasks devoid of meaning, or surveillance-based controls predict burnout, turnover, reduced performance, and compromised safety. The relationship between work design and outcomes is robust across industries, occupations, and cultures.
Technology Affects Work Design—But How?
Digital technologies fundamentally alter these critical work characteristics, yet their effects are neither simple nor predetermined. Technology can enhance autonomy by distributing information that enables localized decision-making—or it can erode autonomy through algorithmic management that removes human judgment from work processes. Automation can eliminate dangerous, repetitive tasks and free workers for higher-skill activities—or it can create monitoring roles characterized by monotonous vigilance punctuated by high-pressure interventions when systems fail.
The same technology implemented differently yields divergent work designs. When advanced manufacturing systems gave operators autonomy to diagnose and resolve problems, performance and wellbeing improved; when similar systems relegated operators to passive monitoring with specialists handling problems, the benefits diminished. When online labor platforms structured work to include team membership and peer interaction, workers experienced greater meaning and support than on platforms lacking such social architecture, despite similar task types.
These variations reveal that technology's impact depends on:
Technology-related characteristics: Type of system, degree of human-centered design, performance and reliability, configurability
Individual factors: Skills, education, personality, trust in technology, technology self-efficacy, adaptive responses over time
Team and organizational conditions: Pre-existing work methods, management philosophy, organizational strategy, level of operational uncertainty, employee participation in design and implementation
Occupational characteristics: Skill requirements, task routineness, knowledge intensity
Macro-level forces: Labor laws and regulations, institutional regimes (e.g., worker councils, unions), national culture, industry norms
This multifactorial causation means that understanding technology's effects requires examining how technical capabilities interact with these shaping factors. Work design becomes the crucial mediating variable—the mechanism through which technology influences outcomes. By mapping these relationships, we can move from passive acceptance of technological "impacts" to active shaping of sociotechnical systems.
The Sociotechnical Systems Imperative
The principle of joint optimization—designing technical and social systems together to achieve both human and performance goals—originated in 1950s studies of coal mining but remains highly relevant. Sociotechnical thinking recognizes that technical systems and work organization are interdependent; optimizing one while neglecting the other yields suboptimal results. Effective systems design considers how technology and work structure together enable humans to manage operational variances, maintain situational awareness, and deploy their capabilities effectively.
Contemporary technologies challenge sociotechnical principles in new ways. When systems become opaque even to designers, when algorithms adapt faster than humans can track, when accountability for system behavior becomes ambiguous—the very notion of "human-in-the-loop" control requires rethinking. Some scholars argue that rather than preserving traditional human authority over increasingly autonomous systems, we should grant technology a more independent role while redistributing accountability to system designers and operating organizations.
Despite these challenges, the core insight remains: technology and work must be designed together, considering human capabilities, needs, and limitations alongside technical functionalities. The following sections examine how contemporary technologies affect specific work characteristics, revealing both the opportunities and risks that sociotechnical design must address.
Organizational and Individual Consequences of Technology-Mediated Work Design
Job Autonomy and Control
Autonomy—influence over work processes, decisions, methods, and timing—is among the most consequential work characteristics. High autonomy enhances motivation by creating psychological ownership, reduces strain by enabling active coping with demands, and improves performance by allowing workers to manage operational uncertainties locally rather than escalating issues through hierarchies.
Decision-Making Within Work Processes
Information technologies can increase autonomy by distributing knowledge that empowers localized decision-making. Real-time performance data, access to organizational information systems, and internet-enabled knowledge resources allow workers to solve problems and coordinate activities without constant supervisory oversight. Self-organizing online communities like Fablab and the Maker Movement illustrate how accessible technologies enable distributed innovation and control.
Yet automation and AI can profoundly reduce autonomy through what researchers call the "out-of-the-loop" problem. When systems automate complex tasks, leaving humans to monitor and intervene during failures, workers become supervisory controllers of processes they no longer fully understand. In aviation, the introduction of autopilot increased safety initially but eventually contributed to accidents when pilots lost situational awareness and struggled to manually intervene during automation failures. The erosion of active engagement and skill practice impairs humans' ability to resume control when needed.
Algorithmic decision-making represents another autonomy challenge. Credit scoring algorithms replace loan officers' judgment; applicant tracking systems filter candidates before human review; warehouse management systems assign tasks algorithmically. These systems optimize efficiency across entire operations but remove human influence over specific decisions. When algorithms are opaque—decision logic invisible even to developers—workers cannot meaningfully review or contest algorithmic judgments, experiencing what some describe as "black box" control.
When a regional health system introduced tablet-based scheduling and task tracking for home care nurses, the system algorithmically optimized visit sequences and durations. Initially, this appeared beneficial—nurses received clear schedules without manual planning. However, the algorithm's rigid time allocations forced nurses to cut some visits short while spending excess time at others where patients needed less care. Some nurses began working unpaid hours to provide adequate care, restoring autonomy and meaning but at personal cost. The system measured compliance with algorithmic schedules, not quality of care, creating tension between professional judgment and automated control.
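The tension in that case can be reduced to a few lines of arithmetic. The sketch below is hypothetical (the allotment and patient figures are invented for illustration), but it captures the structural mismatch: a scheduler that allots fixed visit durations against patients whose needs vary.

```python
# Hypothetical sketch of rigid algorithmic time allocation versus
# variable patient need. All numbers are illustrative.

ALLOTTED_MIN = 30  # the algorithm's fixed time allocation per visit

visits = [
    {"patient": "P1", "needed_min": 20},
    {"patient": "P2", "needed_min": 45},  # needs more than allotted
    {"patient": "P3", "needed_min": 30},
]

unpaid_overrun = 0
for v in visits:
    shortfall = v["needed_min"] - ALLOTTED_MIN
    if shortfall > 0:
        # The nurse either cuts care short or absorbs the time herself.
        unpaid_overrun += shortfall

print(f"Unpaid minutes absorbed to maintain care quality: {unpaid_overrun}")
```

The system's compliance metric sees only whether each visit fit its slot; the overrun variable, the part the nurses absorbed personally, is invisible to it.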
Autonomy Over When and Where to Work
Information and communication technologies theoretically enable flexible work arrangements—remote work, flextime, results-only work environments—that enhance boundary control: influence over when and where work occurs. For professionals, this flexibility can improve work-life integration and job satisfaction.
Yet technology-enabled flexibility reveals paradoxes. Constant connectivity creates expectations for immediate availability, with coworkers and clients demanding rapid responses regardless of time or location. Teleworkers report that the flexibility that attracted them to remote work becomes constrained by pressures for perpetual availability, leading some to deliberately "unplug" to restore boundaries.
In the gig economy, promised flexibility often proves illusory for low-wage workers. Uber promotes "be your own boss" flexibility, yet drivers face substantial soft controls: surge pricing algorithms create pressure to work during specific times; drivers must accept rides without knowing destinations or fares; canceling unprofitable rides risks deactivation; messaging urges drivers to continue during peak periods ("demand is very high—don't stop now!"). For precarious workers lacking negotiating power, organizational flexibility (on-demand labor) substitutes for genuine worker autonomy.
A food delivery platform marketed flexible scheduling to attract riders. However, the algorithmic rating system penalized riders who declined orders, even for valid reasons like vehicle problems. High ratings determined access to preferred delivery zones and shifts. Riders reported feeling compelled to accept all orders and work extensively to maintain ratings, despite nominal freedom to set schedules. The platform's flexibility primarily benefited the company (variable labor supply) rather than workers (meaningful control).
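The control mechanism in this case can be sketched in a few lines. The penalty size and access threshold below are hypothetical assumptions chosen for illustration, not any real platform's values, but they show how "nominal freedom" to decline work can be hollowed out by rating-gated access.

```python
# Hypothetical sketch of rating-gated access: declining an order lowers
# the rider's score, and the score gates preferred zones and shifts.
# Penalty and threshold values are illustrative only.

def updated_rating(rating: float, declined: bool) -> float:
    # The penalty applies regardless of the rider's reason for declining.
    return max(0.0, rating - (0.2 if declined else 0.0))

def preferred_zone_access(rating: float, threshold: float = 4.5) -> bool:
    return rating >= threshold

rating = 4.6
rating = updated_rating(rating, declined=True)  # e.g. a vehicle breakdown
print(round(rating, 1), preferred_zone_access(rating))
```

A single decline, even for a valid reason, drops the rider below the access threshold; the rational response is to accept everything, which is exactly the behavior the riders in the case described.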
Implications
Job autonomy in digital work environments depends critically on design choices. Technologies can distribute information and decision authority, empowering workers—or they can centralize control through algorithmic management that removes human judgment. The type of technology matters (information systems that decentralize versus communication systems that centralize), as does the context (skill levels, management philosophy, operational uncertainty, organizational strategy). Individual responses—mistrust leading to under-use, or over-reliance leading to skill degradation—further shape outcomes.
Preserving meaningful human control while leveraging automation's benefits requires deliberate design. This includes maintaining transparency so humans understand system behavior; providing override capabilities; designing systems to augment rather than replace human judgment; and matching automation levels to task characteristics and operational conditions.
Skill Variety and Use
Work that utilizes diverse skills, involves meaningful task sequences, and provides intellectual challenge fosters intrinsic motivation, job satisfaction, and skill development. Technology profoundly influences these characteristics.
Skill Enhancement Through Automation
When automation eliminates dangerous, repetitive, or physically demanding tasks, it can free humans for work requiring judgment, creativity, and interpersonal skills. Robots handling hazardous materials allow workers to focus on system oversight and problem-solving. AI systems processing routine data analysis enable professionals to concentrate on complex interpretation and client interaction. This "upgrading" narrative suggests technology complements human capabilities rather than simply replacing them.
Skill Degradation and Deskilling
However, automation can also create jobs characterized by passive monitoring with little active skill use—a recipe for disengagement and skill atrophy. Operators supervising automated process control in chemical plants or power generation face prolonged periods of vigilance with infrequent but high-stakes interventions. Sustaining attention and maintaining intervention capability under such conditions is extremely difficult.
Aviation provides the most documented example of automation-induced skill loss. As cockpit automation increased, pilots' manual flying skills deteriorated due to lack of practice. Regulatory authorities now mandate minimum manual flying time and extensive simulator training to preserve capabilities needed when automation fails. Similar concerns arise for autonomous vehicles: if drivers become passive supervisors, their ability to resume control during system malfunctions is questionable.
Lean manufacturing and business process reengineering, enabled by information systems, can fragment work into narrowly specialized tasks that reduce skill variety even when technology does not directly automate. Enterprise resource planning (ERP) systems sometimes create standardized workflows that limit professional discretion and skill application.
When a teaching hospital introduced the da Vinci robotic surgery system, attending surgeons increasingly performed critical steps themselves rather than progressively delegating to residents as they gained proficiency. The technology's magnified visualization and precise controls created efficiency pressures and liability concerns that shifted challenging work away from trainees. Residents completed training with legal authorization to perform robotic procedures but insufficient experiential learning. Skill development requires practice "at the edge of competence" with appropriate feedback—conditions that robotic surgery inadvertently disrupted through changed work allocation.
Micro-Task Fragmentation
Platform-based business models enable extreme task fragmentation into "micro-tasks" coordinated through digital marketplaces. Workers on Amazon Mechanical Turk perform tasks like image tagging, data validation, or survey completion—often taking seconds or minutes, paid pennies per task. Such work lacks task identity (completing meaningful wholes), variety, and significance, resembling industrial piecework but applied to cognitive tasks. While providing income flexibility for some, these jobs offer minimal skill development, meaning, or progression.
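The economics of "pennies per task" are worth making explicit. This back-of-the-envelope sketch uses invented figures (they are assumptions for illustration, not measurements of any real platform) to show how piece-rate micro-task pay translates into an effective hourly wage.

```python
# Hypothetical back-of-the-envelope sketch of micro-task economics:
# piece-rate pay per task implies an effective hourly wage.
# All figures are illustrative assumptions.

pay_per_task_usd = 0.05   # "pennies per task"
seconds_per_task = 40     # includes finding and loading the next task

tasks_per_hour = 3600 / seconds_per_task
effective_hourly_wage = tasks_per_hour * pay_per_task_usd
print(f"${effective_hourly_wage:.2f}/hour")
```

Even small amounts of unpaid search time between tasks push the effective wage lower still, since only completed tasks are compensated.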
Implications
Whether technology enhances or diminishes skill use depends on strategic choices about which tasks to automate and how to structure remaining human work. An "informate" strategy—using automation-generated data to provide feedback and decision support to empowered workers—yields higher-quality jobs than an "automate" strategy focused purely on replacing human effort. Organizations must balance efficiency gains from automation against the need to preserve skill diversity, learning opportunities, and meaningful challenge.
Occupational context matters: highly skilled knowledge work may resist deskilling, while routine work faces greater automation risk. Individual and organizational responses also shape outcomes—some workers actively craft roles to restore challenge and meaning despite constraining technologies.
Job Feedback and Related Learning Supports
Feedback—knowledge of work results—is essential for learning, performance improvement, and motivation. Related characteristics like role clarity and task identity also support mastery and effectiveness.
Technology-Enhanced Feedback
Wearable devices, sensors, and data analytics can provide real-time, personalized performance feedback. Call center agents receive immediate coaching on communication patterns; athletes track biomechanical performance; salespeople monitor activity metrics. Information systems can distribute operational data throughout organizations, helping employees understand how their work contributes to broader goals.
In manufacturing contexts, devolution of real-time production information to operators enabled them to detect and resolve problems more quickly than when information remained centralized with specialists, improving both performance and operator engagement.
Impaired Feedback and Learning
Conversely, automation can reduce feedback and disrupt learning. When pilots rely heavily on autopilot, they receive less tactile and visual feedback about aircraft behavior, impairing their situational awareness. The lack of hands-on practice during automated operation degrades the skills needed to interpret subtle cues during manual flight.
In surgical training, robotic systems paradoxically provided both enhanced feedback (magnified visualization) and reduced feedback (loss of tactile sensation, physical separation from patient). More significantly, changed work allocation—attendings performing more tasks themselves—reduced residents' opportunities for guided practice and feedback, the core mechanism of skill acquisition. Trainees completed programs legally authorized to operate robots but without the experiential foundation to do so safely.
Algorithmic Feedback Challenges
Performance monitoring through algorithmic systems creates feedback that can be excessive, punitive, biased, or misaligned with work quality. Uber drivers receive passenger ratings that powerfully affect their reputation and access to desirable work, yet these ratings may reflect factors beyond driver control (traffic conditions, passenger mood). Platform workers often report frustration when algorithmic performance judgments seem arbitrary or when they lack channels to contest unfair ratings.
Healthcare workers using electronic record systems report that documentation demands increase while time for patient interaction decreases—the system provides extensive feedback on compliance with documentation protocols but less feedback on actual care quality.
A hospital introduced mobile telepresence robots allowing remote physicians to conduct rounds in intensive care units. For residents who had previously prepared thoroughly—discussing cases with nurses and reviewing patient histories—the robot enhanced coordination by bringing the attending physician virtually to bedside, enabling richer multidisciplinary conversations. However, for residents who had relied on brief phone reports without direct nurse consultation, the robot exposed their lack of preparation. The attending could now interact directly with nurses, revealing gaps in the resident's knowledge. The same technology thus enhanced feedback and learning for some while creating uncomfortable visibility of deficient preparation for others—demonstrating how technology's effects depend on pre-existing work practices.
Implications
Technology's impact on feedback and learning depends on whether systems are designed to "informate"—providing actionable information that supports human judgment—or simply to monitor compliance. Human-centered automation maintains feedback channels that sustain situational awareness and skill practice. This includes preserving tactile and visual feedback where possible, ensuring transparency about system states and decisions, and creating opportunities for deliberate practice even when automation handles routine operations.
Algorithmic feedback systems should be designed for learning and development rather than purely for surveillance and control. This includes ensuring feedback is timely, specific, fair, and actionable; providing mechanisms to review and contest algorithmic judgments; and avoiding excessive monitoring that creates stress without enabling improvement.
Social and Relational Aspects of Work
Human needs for connection, belonging, and social support make relational work characteristics important for wellbeing and coordination. Technology profoundly reshapes social dimensions of work.
Technology-Enabled Connection
Information and communication technologies can maintain and even enhance social connection across distance and time. Video conferencing, messaging platforms, and social intranets enable remote workers to participate in team discussions, access peer support, and maintain organizational belonging. Research shows that after initial adjustment, teleworkers often develop effective strategies for remote collaboration, and social media can buffer against isolation for distributed workers.
Platform technologies can deliberately structure social interaction. Some crowdwork platforms organize workers into teams with leaders, encourage peer knowledge sharing, and create inter-team competition—building community and support even in digitally mediated work. Externally organized online forums allow gig workers to share information, discuss platform policies, and provide mutual assistance outside formal platform structures.
Technology-Mediated Disconnection
Yet technology-mediated communication can also impair social connection and coordination. Virtual teams often struggle with trust-building, coordination, and conflict resolution compared to co-located teams. Technology-mediated communication reduces social cues (facial expressions, body language, conversational nuance) that enable empathy and mutual understanding. When work occurs entirely through digital platforms with minimal human contact—emails, chatbots, automated task assignment—workers may feel isolated and unsupported.
Drivers for a ride-sharing platform reported that when problems arose—app malfunctions, incorrect fares, passenger complaints—their only recourse was emailing customer support, often receiving generic automated responses. One driver explained: "You email everything. There's no one to talk to. If your app breaks, you just have to wing it." The lack of human interaction for problem-solving created frustration and a sense of disconnection from the platform that ostensibly employed them.
Changed Roles and Coordination Practices
Technology can fundamentally restructure roles and interaction patterns. When internet-based car sales provided customers with pricing information previously available only to dealers, it changed the scripts and power dynamics of sales conversations. When surgical robots altered the physical workspace, they affected who could see and interact with whom, sometimes reducing nurses' work visibility and status.
The use of abstract data representations can create coordination challenges. When team members view work primarily through quantitative dashboards rather than shared physical spaces or direct communication, developing shared situational understanding becomes more difficult. Trust—essential for coordination in uncertainty—must extend beyond human colleagues to encompass technology and data themselves.
Implications
Whether technology enhances or impairs social and relational work aspects depends on system design, task characteristics, and usage patterns. Platforms can be structured to foster community or leave workers atomized. Video collaboration tools can be used to strengthen relationships or simply to surveil remote workers. The appropriate balance between technology-mediated and face-to-face interaction varies by work type—routine task coordination may work well virtually; cross-disciplinary problem-solving may require richer communication.
Organizations should consider social architecture when implementing technology: designing opportunities for interaction, ensuring access to social support, maintaining visibility of colleagues' work, and preserving informal communication channels. Training in effective use of collaboration technologies—understanding their affordances and limitations—can help workers leverage digital tools while mitigating disconnection risks.
Job Demands
While the previous sections focused on job resources (characteristics that help achieve goals, cope with demands, and support development), technology also affects job demands—work aspects requiring sustained effort and associated with strain.
Shifts in Cognitive and Physical Demands
Automation typically reduces physical demands by eliminating heavy manual labor, though increased sedentary computer work creates its own musculoskeletal and other health concerns. Cognitive demands shift in complex ways. Automation of routine tasks can increase cognitive challenge when remaining work requires judgment and problem-solving. However, reducing humans to system monitors creates the paradoxical demand of sustained vigilance—maintaining attention when little happens, yet responding rapidly when rare but critical events occur. This is cognitively difficult and inherently fatiguing.
Workload Variability
Automated systems can create feast-or-famine workload patterns. During normal operations, demands are low (monitoring). During system failures or exceptional conditions, demands surge as humans must diagnose problems, take manual control, and coordinate recovery—often under time pressure with imperfect information. Such variability strains adaptation capacity.
Administrative and Compliance Burdens
Technologies ostensibly designed to reduce administrative work can paradoxically increase it. Healthcare providers report that electronic health record systems add documentation time without proportionally reducing other work. When organizations implement self-service technologies (employees booking their own travel, managing their own HR transactions), administrative tasks are devolved to workers, often proving more demanding and less aligned with professional identities than anticipated.
A multinational manufacturer introduced an enterprise resource planning system requiring employees to self-administer travel bookings, accommodation, leave requests, personal data updates, and expense claims. Despite promises of efficiency, many employees found the system cumbersome and time-consuming. Professionals accustomed to focusing on their core work resented becoming their own administrators. The system's complexity required frequent help-desk contacts. Employees perceived the technology as imposing bureaucratic demands incompatible with their roles, creating frustration despite the system's intent to streamline processes.
Surveillance and Performance Monitoring Demands
Perhaps the most significant technology-related demand increase involves surveillance—the continuous tracking, measurement, and evaluation of work behavior. Sensors, big data analytics, and algorithmic management enable monitoring at unprecedented granularity and intensity. Warehouse workers have movement tracked minute-by-minute; drivers have routes, speeds, and behaviors logged; knowledge workers have emails, keystrokes, and work patterns analyzed.
This "electronic performance monitoring" can provide useful feedback but often creates stress through constant scrutiny and the perception of distrust. The distinction between monitoring for improvement versus monitoring for control is critical. When workers perceive surveillance as punitive rather than developmental, negative effects on wellbeing and performance follow.
Algorithmic management combines automation of decision-making with intensive monitoring. Gig workers may be rated on each task, have their locations continuously tracked, receive algorithmically generated performance warnings, and face "deactivation" (termination) based on opaque algorithmic criteria. Such systems create demands for constant attention to metrics, adaptation to unpredictable algorithmic decisions, and emotional labor to manage the stress of precarious, monitored work.
Implications
Managing job demands in digitally mediated work requires thoughtful system design and organizational policies. Cognitive demands should be matched to human capabilities—avoiding both understimulation (prolonged monitoring) and overload (excessive information or task volume). Automation strategies should consider demand variability, ensuring humans maintain skills and situational awareness needed for high-demand periods.
Administrative technologies should genuinely reduce rather than devolve burden. Performance monitoring should serve developmental purposes, implemented transparently with employee participation, rather than functioning primarily as control. Research suggests that monitoring is more acceptable and effective when used for coaching, when workers have input into what is measured, and when it is limited to performance-relevant behaviors rather than extending to pervasive surveillance.
Evidence-Based Organizational Responses
Table 1: Impact of Digital Technologies on Work Design Dimensions
| Technology Type | Work Design Dimension | Positive Impact/Opportunity | Negative Impact/Risk | Shaping Factors | Illustrative Example | Intervention Strategy |
| --- | --- | --- | --- | --- | --- | --- |
| Artificial Intelligence / Machine Learning | Autonomy and Decision-Making | Increases autonomy by distributing knowledge and real-time data to empower localized decisions. | Reduces autonomy through "black box" control; workers become passive supervisors of systems they do not understand (out-of-the-loop problem). | Technology opacity, management philosophy, degree of human-centered design, and task complexity. | AI-driven credit scoring or applicant tracking systems replacing human judgment. | Maintain transparency; provide override capabilities; design systems to augment rather than replace judgment. |
| Algorithmic Management | Autonomy and Job Demands | Provides clear schedules and visit sequences without the need for manual planning. | Rigid time allocations; tension between professional judgment and automated control; surveillance-based stress. | Clarity of metrics (compliance vs. quality), transparency of decision logic, and presence of appeal channels. | Tablet-based scheduling for home care nurses in a regional health system. | Joint sociotechnical optimization; involving workers in setting parameters; focusing feedback on development rather than punishment. |
| Platform Economy / Gig Tech | Autonomy and Social Connection | Potential for flexible work arrangements and work-life integration; external forums allow for mutual assistance. | Illusory flexibility; soft controls (surge pricing, ratings) create pressure; algorithmic isolation with no human support. | Labor laws, institutional regimes, platform social architecture, and worker negotiating power. | Uber drivers facing pressure to work peak times; food delivery riders penalized by rating systems. | Macro-level policies on employment classification; structuring platforms to foster community/teams. |
| Robotics (Surgical/Collaborative) | Skill Variety and Use | Eliminates dangerous or repetitive tasks; allows focus on high-level system oversight. | Deskilling and lack of experiential learning for trainees; deterioration of manual skills due to lack of practice. | Work allocation choices by senior staff, liability concerns, and feedback mechanisms (tactile vs. visual). | Da Vinci robotic surgery system reducing resident opportunities for hands-on practice. | Complementarity-based function allocation; preserving "at the edge of competence" practice for trainees. |
| Automation (General Industrial) | Job Feedback and Skill Variety | Devolution of real-time production info allows operators to resolve problems quickly. | Passive monitoring leads to disengagement, vigilance fatigue, and skill atrophy. | Implementation strategy ("informate" vs. "automate"), operational uncertainty, and user participation. | Chemical plant operators maintaining control over variance management vs. passive monitoring. | Participatory design; training that includes work redesign; regular work design audits. |
| Information Systems (ERP/Documentation) | Job Demands | Streamlining processes and creating centralized data access. | Increased administrative load; devolving tasks to professionals (self-service) that clash with core identity. | System configurability, user-centered design, and organizational culture. | Multinational manufacturer employees spending excessive time on self-administered travel/HR bookings. | User-centered design in procurement; ensuring tech reduces rather than shifts the burden. |
| Communication Technologies (Remote Work) | Social and Relational Aspects | Maintains connection across distances; buffers against isolation through social media. | Erosion of social cues (facial expressions); reduced trust and coordination compared to co-location. | Organizational social architecture, industry norms, and individual digital literacy. | Virtual teams struggling with conflict resolution; teleworkers feeling pressure for constant availability. | Training in collaboration technology; preserving informal communication channels and face-to-face interaction. |
Strategy 1: Proactive Work Design During Technology Implementation
The foundational principle of sociotechnical systems thinking—joint optimization—holds that technical systems and work organization should be designed together. This requires considering work design implications explicitly during technology procurement, implementation, and adaptation.
Participatory Design and Implementation
Involving workers in technology design and implementation improves outcomes by incorporating their frontline knowledge, enhancing acceptance, and enabling real-time adaptation. When operators helped design advanced manufacturing systems, choosing which tasks to automate and which to retain human control over, implementations achieved better performance and satisfaction than when engineers made such decisions unilaterally.
Participatory approaches include:
User involvement in requirements specification: Workers contribute knowledge about actual work processes, variability, and coordination needs, improving system design
Prototype testing and iteration: Early trials with actual users reveal work design implications, allowing refinement before full deployment
Training that includes work redesign discussion: Rather than teaching workers to use predetermined systems, training explores how work might be restructured around new technology
Post-implementation review and adaptation: Regular assessment of work design effects enables ongoing optimization
A chemical plant implementing advanced process control automation gave operators choice about automation levels for different aspects of work. Operators retained control over troubleshooting and variance management—tasks requiring situational judgment—while routine parameter adjustments were automated. This preserved meaningful work while reducing physical demands and improving consistency. Operators reported higher job satisfaction and maintained skills needed to intervene during system failures. Plant performance exceeded expectations compared to similar facilities where automation was more complete but operators felt disengaged.
Function Allocation: Complementarity Rather Than Substitution
A critical implementation choice is function allocation—deciding which tasks humans perform and which are automated. Traditional "left-over function allocation" automates everything possible, leaving humans with residual tasks—often monitoring roles that are cognitively demanding yet provide little meaning or skill use.
Alternative approaches emphasize complementarity: allocating functions based on relative human and machine strengths. Humans excel at pattern recognition in novel contexts, moral reasoning, empathetic communication, and adaptation to unanticipated situations. Machines excel at precision, speed, consistency, and handling of explicitly defined algorithms. Effective allocation capitalizes on these complementary strengths rather than viewing humans as gap-fillers for automation's limitations.
This might mean:
Automating data collection and initial analysis while preserving human judgment about interpretation and action
Using algorithmic recommendations as decision support rather than replacing human decision-making
Maintaining manual backup systems and regular skill practice even when automation handles routine operations
Designing roles that combine automated and manual elements to sustain engagement and capability
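As a minimal, hypothetical sketch of the decision-support pattern described above (the `Recommendation` structure, confidence threshold, and function names are illustrative assumptions, not drawn from any system discussed in this article), an algorithm can propose while a human retains the final say:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Recommendation:
    action: str          # what the algorithm proposes
    confidence: float    # the model's confidence in the proposal
    rationale: str       # human-readable explanation (supports transparency)

def decide(recommendation: Recommendation,
           human_review: Callable[[Recommendation], Optional[str]],
           confidence_floor: float = 0.9) -> str:
    """Complementarity pattern: the algorithm proposes, the human disposes.

    The human reviewer returns an alternative action to override, or None
    to accept the proposal. Human judgment is never silently replaced.
    """
    override = human_review(recommendation)
    if override is not None:
        # The human's choice always wins, regardless of model confidence.
        return override
    if recommendation.confidence < confidence_floor:
        # Low-confidence proposals with no explicit human choice still
        # default to the proposal here, but a real system might escalate.
        return recommendation.action
    return recommendation.action
```

For example, a reviewer who accepts the proposal simply returns `None`, while one who disagrees returns the substitute action, which takes precedence over the algorithm's output.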
Ongoing Monitoring and Adjustment
Technology implementation is not a one-time event but an ongoing process as systems evolve, users adapt, and unintended consequences emerge. Regular assessment of work design effects enables continuous improvement:
Work design assessments: Systematic evaluation of how technology affects autonomy, skill use, feedback, social connection, and demands
Worker feedback mechanisms: Channels for employees to report work design problems and suggest improvements
Performance monitoring that includes human outcomes: Tracking not only productivity and efficiency but also employee wellbeing, learning, and safety
Iterative refinement: Willingness to modify systems or work organization when initial designs prove problematic
Strategy 2: Human-Centered Technology Development and Procurement
While organizational work design choices matter enormously, the technology itself—its capabilities, interface design, configurability, and underlying logic—also powerfully shapes work. Human-centered design principles should inform technology development, not just implementation.
Principles of Human-Centered Automation
Research in human factors and ergonomics has developed extensive principles for automation that supports rather than undermines human performance:
Transparency: Systems should be comprehensible—users can understand current system state, how it reached that state, and what it will do next. Opaque "black box" algorithms undermine human oversight and trust.
Predictability: System behavior should be sufficiently consistent that users can anticipate it and plan accordingly. Unexpected "automation surprises" impair human supervision.
Controllability: Humans should be able to influence or override automated functions when needed. This includes manual backup modes and clear procedures for taking control.
Feedback preservation: Automation should maintain sensory feedback that supports situational awareness. Excessive abstraction of information or removal of tactile feedback impairs human understanding.
Appropriate automation level: The degree of automation should match task characteristics, operational uncertainty, and human supervision capabilities. More automation is not always better.
These principles must be adapted for contemporary AI and machine learning systems, where transparency and predictability are inherently limited by systems' complexity and adaptive nature. This raises profound questions about human oversight and accountability.
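As a rough illustration of how the transparency, controllability, and feedback-preservation principles might surface in software (a hypothetical controller sketch under assumed names; it is not modeled on any real automation product), the system can always report its state and recent history, announce its mode, and yield to manual control:

```python
from collections import deque

class TransparentController:
    """Minimal sketch of human-centered automation principles:
    the system can report its current state and recent history
    (transparency), behaves consistently (predictability), and
    lets a human take over at any time (controllability)."""

    def __init__(self, setpoint: float):
        self.setpoint = setpoint
        self.mode = "auto"               # auto vs. manual operation
        self.history = deque(maxlen=5)   # recent readings/actions, for "how did we get here?"

    def step(self, reading: float) -> float:
        if self.mode == "manual":
            action = 0.0  # the human operates the process directly
        else:
            action = self.setpoint - reading  # simple proportional correction
        self.history.append((reading, action))
        return action

    def explain(self) -> str:
        # Transparency: state what the system is doing and why.
        return (f"mode={self.mode}, setpoint={self.setpoint}, "
                f"recent={list(self.history)}")

    def take_manual_control(self) -> None:
        self.mode = "manual"  # controllability: explicit human override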
Aircraft manufacturers Airbus and Boeing adopted different automation philosophies. Airbus systems use side-stick controllers without tactile feedback between co-pilots' controls, emphasizing optimized automated flight. Boeing systems retain interconnected yokes that provide tactile feedback of the other pilot's actions, preserving awareness of manual inputs. Both approaches have merits, but the feedback-preserving design arguably better supports coordination and situational awareness when manual intervention is needed—illustrating how design details profoundly shape human-automation interaction.
User-Centered Design in Technology Development
Rather than treating human factors as an afterthought (evaluating completed systems to identify problems), user-centered design integrates human considerations throughout development:
Early user research: Understanding actual work practices, goals, constraints, and variability before designing systems
Iterative prototyping with user testing: Developing progressive versions with feedback from representative users
Attention to exceptional conditions: Designing not only for normal operations but for error recovery, system failures, and unusual situations where human intervention is critical
Configurability: Building flexibility so systems can be adapted to different contexts, user preferences, and evolving needs
Technology procurement should include work design criteria alongside functionality and cost considerations:
Evaluating work design implications: Assessing how candidate systems affect autonomy, skill use, feedback, etc.
Requiring human-centered documentation: Vendors should specify design principles, user involvement in development, and human factors testing
Pilot testing with work design focus: Trials should evaluate effects on actual work practices, not just technical performance
Including workers in procurement decisions: Those who will use systems should help evaluate alternatives
Design for Job Quality, Not Just Efficiency
A fundamental shift is needed in technology development priorities. Current incentives overwhelmingly favor efficiency—speed, cost reduction, labor substitution. While efficiency matters, it should not be the sole criterion. Technology should also be designed for:
Learning and skill maintenance: Providing opportunities for practice, feedback, and progression
Meaningful work: Supporting task identity, significance, and variety
Autonomy preservation: Enabling rather than constraining human judgment
Social connection: Facilitating rather than replacing human interaction
Wellbeing: Considering physical ergonomics, cognitive load, emotional demands, and psychological safety
Some of these goals align with efficiency; others may involve trade-offs requiring explicit value judgments about organizational priorities and societal goals.
Strategy 3: Macro-Level Policies Supporting Better Work Design
While organizations make micro-level implementation choices and developers make design choices, broader policy frameworks shape the environment in which these decisions occur. Regulation, labor institutions, professional standards, and economic incentives all influence technology's ultimate effects on work quality.
Regulatory Approaches
Various regulatory initiatives address technology's workforce implications:
Data protection and privacy: Regulations like Europe's GDPR limit collection and use of personal data, with implications for workplace monitoring and algorithmic decision-making
Algorithmic transparency and accountability: Emerging requirements that algorithmic decisions affecting individuals (employment, credit, public services) be explainable and contestable
Employment classification: Policies determining whether platform workers are employees or independent contractors, affecting their rights and protections
Workplace health and safety: Standards addressing digital work risks, including psychosocial hazards like excessive monitoring or unpredictable algorithmic management
The UK Engineering and Physical Sciences Research Council established principles including "Humans, not robots, are responsible agents; robots are tools designed to achieve human goals" and "The transparency principle: it should be possible to find out who is responsible for decisions made by robots." While aspirational, such principles signal policy orientation toward maintaining human accountability and requiring explicability—countering purely technocentric approaches.
Labor Institutions and Collective Voice
Worker representation through unions, works councils, and professional associations influences technology implementation. In some European countries, workers' councils have formal consultation rights regarding technology introduction, creating opportunities to negotiate work design implications. Collective bargaining can establish:
Participation rights in technology decisions: Worker voice in selection, implementation, and evaluation of new systems
Protections against excessive monitoring: Limits on surveillance intensity, data collection scope, or use of monitoring data
Training and skill development provisions: Resources to help workers adapt to technological change
Transition support: Assistance for workers displaced by automation
The effectiveness of these mechanisms varies by country, industry, and institutional strength. Where worker voice is weak, technology tends toward employer-favoring outcomes.
Professional Standards and Certification
Professional bodies in fields like engineering, information systems, and human factors can promote work design considerations through:
Educational requirements: Including sociotechnical thinking, work design, and human-centered design in curricula
Professional ethics codes: Establishing obligations to consider human and organizational impacts of technology
Certification standards: Requiring demonstration of human-centered competencies
Best practice dissemination: Publishing guidelines, case studies, and tools for effective sociotechnical design
Economic Incentives and Public Procurement
Government and large organizations can use procurement power to favor technologies designed with work quality in mind:
Including work design criteria in technology procurement: Evaluating vendors on human factors, not just functionality and cost
Funding for workplace innovation: Supporting research and implementation of sociotechnical approaches
Tax or regulatory incentives: Favoring organizations that demonstrate commitment to job quality alongside productivity
Public discourse and media attention also shape technological development. When privacy violations, algorithmic bias, or exploitative platform work practices receive critical scrutiny, it creates pressure for reform. Conversely, uncritical celebration of technological "disruption" without examining work consequences enables problematic developments.
Strategy 4: Education and Training Beyond Digital Skills
The dominant response to technological change emphasizes "upskilling" workers—providing training in digital competencies, data literacy, coding, AI fundamentals, and continuous learning mindsets. These efforts are important but insufficient. Three additional educational priorities are needed.
Teaching Work Design Literacy to Multiple Stakeholders
Currently, work design receives minimal attention in most professional education. Business schools emphasize strategy, finance, and marketing; engineering schools focus on technical capabilities; information systems programs prioritize software development. Work design, when addressed at all, appears briefly in organizational behavior courses as one among many topics.
This neglect means that those designing, implementing, and procuring technology—engineers, operations managers, consultants, executives—lack frameworks for considering work design implications. Education should include:
For technology designers and engineers: Understanding human capabilities, limitations, needs; principles of human-centered design; sociotechnical systems thinking
For managers and consultants: Recognizing work design as a strategic variable; understanding links between work characteristics and outcomes; skills for participatory design processes
For executives and decision-makers: Appreciating trade-offs between efficiency and job quality; understanding long-term performance benefits of good work design; skills for influencing technology vendors and policies
Practical tools matter as much as conceptual knowledge:
Assessment instruments: Methods for evaluating work design quality
Design frameworks: Structured approaches for considering work design during technology projects
Case studies: Examples of successful sociotechnical implementations and cautionary tales of failures
Several business schools have introduced "work design and organizational architecture" as core MBA courses rather than elective topics. These programs teach students to analyze how organizational structure, technology, and work design jointly shape performance and wellbeing. Projects involve redesigning actual organizational systems. Graduates report that this training proves unexpectedly valuable when they encounter technology implementation challenges in their careers: they possess frameworks that peers educated solely in traditional management disciplines lack.
Educating Workers About Job Crafting and Work Design
Workers themselves can shape their work experiences through job crafting—proactive, self-initiated behaviors to modify tasks, relationships, or cognitive perspectives to better align jobs with interests and strengths. Meta-analyses show that job crafting improves work satisfaction, engagement, and performance.
In technology contexts, job crafting might include:
Experimenting with ways to use systems that preserve autonomy and skill use
Seeking opportunities for manual practice even when automation handles routine tasks
Creating informal communication channels to supplement limited platform-mediated interaction
Reframing monotonous technology-enabled work by connecting it to broader purposes
Training workers in work design principles and job crafting strategies enhances their agency in adapting to technological change. Rather than passive recipients of imposed systems, workers become active shapers of sociotechnical arrangements.
Educating Policy Makers, Media, and Citizens
Technology's ultimate trajectory reflects societal choices mediated through policy, investment, and public discourse. Broader education about work design issues can inform these debates:
For policy makers: Understanding how regulation can protect job quality while enabling innovation
For journalists and media: Framing technology stories to examine work implications, not just technical capabilities or business impacts
For investors and business leaders: Recognizing links between job quality and organizational sustainability
For citizens: Building understanding of work quality as a legitimate criterion for evaluating technological change, not just private consumption benefits or economic growth
Public discourse too often accepts technological determinism—treating technology's effects as inevitable rather than shaped by choices. Education can shift these narratives toward recognition that technology futures are contested and shapeable.
Building Long-Term Organizational Capability and Resilience
Cultivating an Ongoing Work Design Orientation
Beyond specific interventions, organizations benefit from embedding work design considerations into ongoing practices:
Work design audits: Periodic systematic assessment of how current work systems affect employees and performance
Technology impact assessments: Requiring that proposals for new technology include analysis of probable work design effects
Worker participation structures: Standing mechanisms (committees, forums, feedback systems) through which employees contribute to work design decisions
Manager development: Making work design part of leadership competency frameworks, with training and accountability
Visible leadership commitment: Executives articulating that job quality matters alongside efficiency, including work design criteria in strategic planning
Distributed Leadership and Ownership
Effective sociotechnical systems often feature distributed rather than hierarchical control—recognizing that those closest to work processes possess crucial knowledge for managing variances and adapting to change. This argues for leadership structures that:
Empower frontline workers and teams to make decisions and shape their work
Treat worker expertise as an asset to leverage rather than a constraint to overcome
Create accountability for human and technical outcomes jointly, not just technical performance
Encourage continuous experimentation and learning rather than fixed "optimized" designs
Purpose, Meaning, and Psychological Contracts
Technology changes not just tasks but also psychological contracts—workers' beliefs about mutual obligations with employers. When automation displaces colleagues, increases monitoring, or deskills work, it may violate implicit expectations of job security, trust, and meaningful work.
Organizations should proactively address these psychological contract shifts:
Communicating openly about technology's purpose and anticipated effects
Involving workers in decisions affecting their work and colleagues
Providing transition support and alternative opportunities for displaced workers
Maintaining commitment to job quality even when pursuing efficiency
Connecting technology changes to broader organizational purpose and values
When workers understand why technology is being adopted and see that their needs are considered, not just overridden, acceptance and effective adaptation increase.
Data Stewardship and Algorithmic Governance
As algorithmic management and workplace analytics proliferate, organizations need governance frameworks addressing:
Data ethics: What worker data may be collected, for what purposes, with what protections?
Algorithm transparency: How can workers understand what algorithmic systems measure and how decisions are made?
Review and appeal: What mechanisms exist to contest algorithmic judgments?
Accountability: Who is responsible when algorithmic systems make problematic decisions?
Ongoing monitoring: How do organizations track algorithmic system effects on workers and performance over time?
These questions lack simple answers, but they require explicit attention rather than defaulting to whatever data collection and algorithmic control happens to be technically feasible.
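One way to make such governance explicit rather than implicit is to encode it as an auditable policy object. The sketch below is purely illustrative (the field names, categories, and purposes are assumptions for the example, not drawn from this article), showing how a collection request can be checked against declared purposes:

```python
from dataclasses import dataclass

@dataclass
class AlgorithmicGovernancePolicy:
    """Hypothetical sketch: the governance questions above expressed
    as explicit, reviewable policy rather than unstated defaults."""
    permitted_data: set        # data ethics: which categories may be collected
    purposes: dict             # purpose limitation: declared purpose per category
    appeal_channel: str        # review and appeal mechanism for contested decisions
    accountable_owner: str     # named human accountable for system outcomes
    audit_interval_days: int   # cadence for ongoing monitoring of effects

    def collection_allowed(self, category: str, purpose: str) -> bool:
        # Collection is allowed only when the category is permitted AND
        # the stated purpose matches the purpose declared for it.
        return (category in self.permitted_data
                and self.purposes.get(category) == purpose)
```

For instance, a policy permitting `task_metrics` only for a declared purpose of `"coaching"` would reject the same data requested for discipline, and reject undeclared categories such as keystroke logs outright.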
Conclusion
Digital technologies—AI, robotics, algorithmic management, platform business models—are fundamentally reshaping work. The critical question is not whether these technologies will transform work, but how—and specifically, whether technological change will enhance or degrade work quality and human wellbeing.
This article proposed work design as the crucial lens through which to understand and shape technology's effects. Decades of research demonstrate that work characteristics—autonomy, skill variety, feedback, social connection, and demands—profoundly affect motivation, learning, health, safety, and performance. Technology alters all of these characteristics, yet its effects are neither simple nor predetermined. The same technology can increase or decrease job autonomy depending on implementation choices; automation can enhance or erode skill use depending on function allocation; algorithmic systems can provide developmental feedback or create oppressive surveillance depending on design and governance.
These contingent effects reflect the interplay of technology characteristics, organizational contexts (management philosophy, work methods, employee participation), individual responses (skill, trust, adaptation), and broader forces (occupational structures, institutional regimes, policies). Understanding this complexity requires moving beyond technological determinism toward sociotechnical thinking that considers human and organizational systems alongside technical capabilities.
Four complementary intervention strategies can steer technological change toward positive work outcomes:
Proactive work design during technology implementation, applying sociotechnical principles of joint optimization, participatory design, complementarity-based function allocation, and ongoing monitoring
Human-centered technology development and procurement, embedding transparency, controllability, and job quality considerations into technology itself
Macro-level policies including regulation, labor institutions, professional standards, and economic incentives that protect work quality while enabling innovation
Education and training that extends beyond digital upskilling to teach work design literacy to technology creators, implementers, managers, workers themselves, and policy makers
Ultimately, we must shift from a reactive posture—helping humans adapt to whatever technology brings—to a proactive stance where technology and work are jointly designed to serve human needs and organizational goals. This requires recognizing that technology development reflects choices, not inevitable trajectories, and that work quality is a legitimate criterion alongside efficiency and innovation.
The future of work is not technologically determined. It will be shaped by the millions of design decisions made by engineers, managers, consultants, workers, and policy makers—decisions about which tasks to automate, how much control to preserve for humans, how to structure roles around new systems, what feedback to provide, how intensely to monitor, and whose interests to prioritize. Work and organizational psychologists, with deep knowledge of human capabilities, needs, and the characteristics of effective work, should be at the forefront of these discussions. The goal is not to resist technological change but to ensure that as technology advances, work becomes more rather than less conducive to human flourishing and organizational effectiveness.

Jonathan H. Westover, PhD, is Chief Research Officer (Nexus Institute for Work and AI); Associate Dean and Director of HR Academic Programs (WGU); Professor of Organizational Leadership (UVU); and OD/HR/Leadership Consultant (Human Capital Innovations).
Suggested Citation: Westover, J. H. (2026). Automation, Algorithms, and Beyond: Why Work Design Matters More Than Ever in a Digital World. Human Capital Leadership Review, 34(1). doi.org/10.70175/hclreview.2020.34.1.2