Think & Act Ahead: A handbook on how to future-proof your organisation
An Academic Research on the Relation of HR and New Technologies and Trends 
By Akram Alsafi



This chapter examines how newly introduced workplace technologies interact with existing employment relationships, drawing on a review of the existing literature and qualitative case studies from a range of organisations.

Technology affects employees in various ways. One of the clearest is its effect on work difficulty and intensity. Most workplace technology is implemented in the expectation that it will increase productivity, which, as discussed later, can also improve people's working lives: providing useful information and data, and automating tedious or routine tasks to reduce the physical burden of people's work. However, we have also seen the flip side of technology. It can raise the burden on employees by increasing the amount of work that needs to be completed: analysing new heaps of data, a more intense workload, continuous retraining to keep pace with technological change, and managing cyber security to safeguard the organisation.

Discussion in recent years has focused largely on how technologies might replace the workforce. Frey & Osborne (2013), for instance, forecast that future computerisation puts 47% of total U.S. employment at risk. Far less attention has been paid to how technology shapes employment for the workers who remain. This chapter does not attempt to predict the individual impact of each future workplace technology. Rather, it focuses on how current technologies have shaped the nature of the UK workplace over the past decade. This includes communications technology ranging from Skype to email, the increasing speed of tablets and smartphones, and the various forms of monitoring and scheduling tools. We will also explore practical applications of virtual reality modelling and wearable technology positioned within the workplace, and how organisations manage the coordination between employee relations and technological change.

The importance of preserving human autonomy

While technology in many ways facilitates people's work (i.e. making jobs easier), there is a legitimate concern that workers may feel dehumanised when doing 'easier' jobs. Replacing the reliance on human judgement and thinking also risks removing employee autonomy, an important element of workforce wellbeing. Autonomy is defined as the "degree to which the job provides substantial freedom, independence and discretion in scheduling the work and determining the procedures to be carried out" (Hackman & Oldham, 1975, p. 162). Employee autonomy is a contributing factor in fostering employee health and wellbeing, and in enhancing work motivation, productivity, creativity and flexibility. In the context of technological change, this becomes particularly important where collaboration among group members improves technology use, as people can be encouraged and enabled to learn from one another. Such benefits, however, come with the risk that technology can erode employee autonomy.

Concerns about threats to people's autonomy at work are not new. In 1915, the spread of Taylorism, or what we describe as 'scientific management', was criticised by contemporaries for turning the worker into a machine or 'automaton' (Drury, 1915, pp. 195-198). In recent years, many workplaces have shifted towards a new form known as 'digital Taylorism': a marked increase in management's ability to leverage new technology to break down, monitor and analyse individual performance tasks. Such Taylorist management styles have now spread across knowledge professions previously regarded as higher skilled and immune to this form of segmentation. Digital Taylorism has been described as "translating knowledge work into working knowledge through the extraction, codification, and digitalisation of knowledge into software prescripts that can be transmitted and manipulated by others regardless of location" (Hugh Lauder & Philip Brown). On this view, future productivity gains will reduce the autonomy and discretion of professionals and line managers, with 'knowledge work' segmented and only a small proportion of workers granted 'permission to think' and the added responsibility of driving the organisation forward. Examples of technologically driven Taylorism include Amazon, where warehouse workers are issued a hand-held device with step-by-step instructions on where to walk, how many items to pick up and which shelf to reach for, while the time taken to meet benchmark targets is tracked and measured. Companies such as Deliveroo and Uber follow similar measures, issuing drivers strict instructions on delivery destinations after completing their pickup, and timing drivers across each part of the route against 'target' times.
Such examples show driver deviation and discretion being discouraged, with the technologies in place (i.e. algorithms and devices) providing the oversight and control. When translating such instances to your organisation, the key challenge in implementing algorithmic decision-making is to train employees carefully to judge when the technology and algorithms should be relied upon, and when to move beyond them in favour of human judgement. Managers need to clarify when employees should embrace algorithms; this is crucial to avoid employees feeling unfairly blamed when automated decisions go wrong. It also requires clear lines of responsibility: if things go wrong when an algorithmic decision is made, it should remain possible to identify the person who can be held accountable. Diakopoulos & Friedler (2016) outline five key principles that decision-making algorithms need to exhibit:

-Explainability: Those affected by algorithms should be made aware of the reasons behind the decisions made (i.e. an employee denied a promotion or raise).

-Auditability: It should be possible for third parties to monitor and review how decisions are made and the outcomes they lead towards.

-Responsibility: There should be full clarity over who has authority over the decision process and who can provide redress or make changes when needed.

-Fairness: Efforts should be made to remove societal and human biases from the algorithms, and decision-making should be checked for potential discriminatory effects.

-Accuracy: Sources of uncertainty and error in the calculation should be identified and investigated to reduce inaccuracy.
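To make the first three principles concrete, the sketch below shows one way an organisation might structure an algorithmic decision record in Python. This is a minimal illustration, not a prescribed implementation from the source; every name, field and value (the employee ID, the owner's email, the log events) is hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AlgorithmicDecision:
    """A record that bakes in three of the five principles:
    explainability (reasons), auditability (audit_log),
    responsibility (owner)."""
    subject: str          # who the decision affects (hypothetical ID)
    outcome: str          # e.g. "promotion deferred"
    reasons: list         # explainability: human-readable reasons
    owner: str            # responsibility: the accountable person
    audit_log: list = field(default_factory=list)  # auditability

    def record(self, event: str) -> None:
        # Timestamped entries let third parties review how the
        # decision was reached and in what order.
        self.audit_log.append(
            (datetime.now(timezone.utc).isoformat(), event)
        )

    def explain(self) -> str:
        # The explanation owed to the affected employee.
        return f"{self.outcome}: " + "; ".join(self.reasons)


# Example usage with entirely hypothetical data:
decision = AlgorithmicDecision(
    subject="employee_1042",
    outcome="promotion deferred",
    reasons=["tenure below threshold", "two open development goals"],
    owner="hr.lead@example.com",
)
decision.record("scored by model v3")
decision.record("reviewed by line manager")
```

Fairness and accuracy checks would sit outside this record, in the model-validation process, but the record gives them something concrete to audit against.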

Regardless of how well the algorithmic process is designed, management should not rely exclusively on data-driven metrics when assessing employee performance. Replacing human interaction entirely with technology ignores the value of conversation, and risks losing the ability to get the best out of people, particularly if algorithms reduce people to tracking numbers rather than names. While some data gathering and analytics can improve performance management, the human-technology balance is crucial: retaining the 'human' side of people's practices, reflecting the organisation's core values and overarching philosophy, while leveraging the available technologies to enhance competitive advantage.

The importance of human relationships

Is remote working harmful or helpful for employees?

Technology has shaped HR at work by fostering the growth of flexible and remote working arrangements. Home workers make up 13.9% of the entire UK workforce, and 86% of British organisations have at least one person working from home (ONS, 2014; BCC, 2015). Much of this growth has been credited to cloud technology: people can access, share and edit remotely stored files, and teleconference through free software such as Slack and Skype, replicating the 'state-of-the-art' boardrooms found within organisations.

For many people, home-working balances work and home commitments with greater autonomy, thus strengthening management-employee trust. Remote arrangements also offer added benefits, such as IT support staff being able to wait for a call at home rather than sitting physically in an office. Further statistics point to added benefits for employee happiness: one U.S. survey found remote workers (7.75 on a 10-point scale) were on average happier than their office-based colleagues (6.69), with 52% reporting that they contacted their direct supervisor only once a day (ACAS, 2013). Issuing flexible policy guidelines and the right technologies for home working forms the basis for boosting engagement and preventing people feeling cut off from their organisation. Raising such effectiveness and engagement means doing the following:

  • Implementing home-work boundary management tactics
  • Fostering manager-employee trust
  • Effective and consistent communication between managers and employees
  • Comprehensive training on the different elements of home-working, made available to both managers and home-workers

Physical vs. Mental Health

Technology, in various ways, has driven significant improvements in health and safety at work. The use of technology and machinery has automated, or rendered safe through replacement, many highly dangerous tasks and occupations such as industrial welding, allowing workers to operate away from the heat, noise and toxic fumes produced. The workplace fatality rate has fallen accordingly over the past 20 years: in 2016 it was under 0.5 per 100,000 workers, compared to 1.1 per 100,000 in 1996 (Health & Safety Executive, 2016). One example is Siemens Congleton, which leveraged a 'virtual reality cave' to plan the design of its workstations, driving significant ergonomic improvements and reducing workers' risk of back injuries when stretching for highly placed items. It used 'Jack & Jill', a digital mannequin that simulates a cell's operation using a digital person, to assess both ergonomics and health and safety. One transformation leader described how the digital tool facilitated people's work following implementation by flagging a certain activity as putting a strain on people. A female employee who performed that activity disagreed with the tool (i.e. 'it's wrong'), insisting that she had been doing it for years. It was only when she found herself standing on a piece of equipment to pick things up that she accepted the digital tool was 'right'.

Generally, such examples indicate a reduction in physically demanding work in recent years. Yet this has come alongside increased mental demands in people's work: as physical health improves, mental health has deteriorated. More than one in six of the UK's working-age population has a common mental disorder, and mental health is a major contributor to working days lost through illness (Adult Psychiatric Morbidity Survey, 2014). This partly reflects the levels of human contact and worker autonomy previously discussed, as well as the growing speed of communication, with work-related stress often cited as the outcome of overflowing email inboxes. Increased dependence on technology across all levels of the organisation can also drive workers' stress and anxiety if the technology malfunctions for any reason, particularly if employees lack the technical skills to manage and fix problems while completing their work tasks.

In other countries, governments have implemented policies to prioritise and safeguard mental and physical health at work. The German government, for example, recently published a future-focused vision of work which recognised that "a shift is taking place from physical to mental demands" and recommended developing a new code called 'Occupational Safety & Health 2.0', with an explicit focus on making it an employer duty to safeguard both physical and mental demands. With technological trends likely to keep accelerating, it is crucial that employers address the pressing issue of mental health at work (Federal Ministry of Labour & Social Affairs, 2016).

Work and personal life boundaries

While technology is designed and positioned to facilitate people's work, its psychological intrusiveness into personal time outside work hours must also be considered. With greater flexibility, technology allows people to choose when and where they work, such as leaving work early to balance other commitments (i.e. picking up kids from school or making family dinner before completing emails later in the evening), rather than having to stay late at the office. Yet heightened expectations of responsiveness and availability drive workplace stress, and an 'always on' culture limits people's ability to relax at home and 'switch off'. France has responded with its 'right to disconnect' measure, whereby employees can negotiate with their employer the right to ignore emails and smartphones outside work hours. Nam's (2014) research on technology use and work-life balance highlights the ambiguity of these new communication technologies in the workplace: despite enabling flexibility and supporting home commitments, psychological intrusion can spread into personal and home life. Employees can be viewed as evenly divided into two types. 'Segmenters' prefer strong boundaries between home and work life, such as staying late at the office when needed and then going home to switch off completely. 'Integrators' prefer weaker boundaries between work and home lives (i.e. going home early and answering emails in the evening). Nam notes that segmenters who feel threatened by laptop and smartphone invasiveness are likely to struggle, particularly in a high-commitment, productivity-driven environment. His management recommendations therefore include making the effort "to recognise more specified categories and patterns of work-life balance as a personal preference" (Nam, 2014).
Doing so avoids the negative outcomes that arise from assuming the workforce should be treated as either 'all segmenters' or 'all integrators'. It also ensures that people's different preferences and individual choices are valued and respected; these may change over time in response to technologies developed in the future. In addition, schedule control and job autonomy weaken the positive association between work contact and sleep problems and distress (Schieman & Young, 2013). Fostering role-modelling management behaviours (i.e. avoiding an 'always on' culture and enabling an engaged workforce) is therefore crucial to mitigating smartphones' negative effects.

Targeting employee fitness

Technology also offers further opportunities for developing employee health and wellbeing. It can help identify and understand how people feel about their work in real time. Monitoring technology is increasingly being adopted by organisations to track both psychological wellbeing (i.e. sleep quality and tiredness) and physical health and fitness. Examples include ECG monitors measuring heart-rate variability as a warning indicator of stress, and written and verbal cues indicating employees' emotional state. More organisations are also embracing wearable technology: 21% of the UK working-age population already use some form of wearable technology for fitness or health purposes, and over half of employees would be happy to consider wearing a smart device or watch if their data were leveraged by their employers to improve the workplace (i.e. stress levels and working hours) (PWC, 2014; Vision Critical, 2017).

One case study comes from Network Rail, which partnered with business psychologists to develop a 'wellbeing snapshot' and 'i-resilience' report for its employees using online self-assessments. This supported managers in pinpointing areas vulnerable to workplace stress and enabled Network Rail to take immediate action where necessary to support employees. Such evidence-based practice indicates how organisations are increasingly taking steps to safeguard employee health and fitness.

Despite the positive impact of these technologies on workforce wellbeing, there is some wariness about how the data will be leveraged for the overall benefit of the organisation. Serious consideration must therefore be given to how data will be stored and used. For instance, we must be aware of the legal implications of data management, with mental and physical health data likely to be classed as 'sensitive personal data' under the Data Protection Act (1998).

There is a risk of employers rushing to implement wearable technology without taking the time to gauge the privacy implications. A cross-cultural survey of IT decision-makers in the UK and US found that 20% of employees cited privacy concerns as their reason for not welcoming wearable technology such as the GENEActiv accelerometer (ACAS, 2013; Rackspace, 2014). Other risks include wearable technology becoming an expectation, with added pressure on employees to maintain their mental and physical health. Suppose organisations go further and penalise employees whose wellbeing and fitness fall below expectations: hardly an ideal outcome. Imagine a scenario where an organisation concludes: "actually John is slightly overweight and at risk of a heart attack, as we can see his stress levels, blood pressure and levels of activity." This could lead to misuse of the data and mismanagement of his entire career if the organisation feels he will become a burden. Acceptable boundaries of trust within the employee-employer relationship are therefore crucial, whereby "'How far is too far?' is a question an organisation can never stop asking or respecting" (PwC, 2014).

This could extend to recruitment, with 'biometric CVs' implemented in the job application process containing data on responses to stress, fitness and sleeping patterns collected by wearable watch technology. Such technologies offer valuable benefits and applications. However, consideration should be given to the range of discrimination claims that could be opened up against employers if candidates feel unfairly penalised on account of their physical or mental health.

While the implementation of wearable tech is designed to streamline and improve employee health and people's performance, there is the question of who manages the data (i.e. the employer or the employee?). Again, this goes back to the earlier discussion of trust in the employee-employer relationship. There is also a risk of mismanaging people relations and workforce engagement by overusing these wearable technologies. Balancing optimised business performance (i.e. implementing data-driven devices) with people's satisfaction and productivity (i.e. safeguarding mental and physical health and fitness) is therefore crucial to future-proofing a successful organisation, and will mitigate the heightened expectations that come with adopting wearable tech.

The rapid pace of technological change

With technological advancement accelerating, one question that arises is how quickly organisations should respond when implementing new technology. There can be a conflict between the likely preference of the younger generation to move quickly and of the older generation to adopt a more cautious approach. Managing both generations within the organisation requires a measured approach to implementation: executing the change through careful preparation and planning at pace, then implementing the new technology while maintaining workforce enthusiasm and avoiding employees becoming disillusioned or impatient. In other words, it means striking a balance between a timely delivery of the technology and ensuring the capability and projects are at the right maturity level for the technology to work effectively. The change process continues even after the technology becomes operational: it is crucial to consistently monitor the short- and long-term impacts and evaluate the overall effectiveness the technology brings.

Furthermore, Willcocks (2013) notes that most modern organisations lack the ability to implement such change quickly, partly because current work practices are already intensified to the point where little time is left for forward thinking or a 'future-proof mindset'. The speed of technological advancement and resource shortages within organisations are cited as inhibiting such forward thinking. He also adds that the pace of technological trends indicates that:

“People have transformational fatigue at the moment; they’ve seen too much change, and they want to slow down. I think organisations’ capacity to change has slowed down. I think organisations are going so fast just to cope with the uncertainty around, and so focused on just getting today’s work done, that although everyone uses the rhetoric of change and innovation, their actual organisation’s capability to change is slowing down.”

The general understanding of managing the pace of technological change seems to promote a 'carefully prepare, quickly implement' approach. Yet organisations' ability to consistently meet such demands will come under increasing strain if workplace technologies continue to advance ever faster over the coming decades. The theoretical capabilities of workplace technology may increasingly outpace organisations' practical ability to adapt and successfully incorporate it. Willcocks (2013) argues that 'future of work' academic research on the theoretical capabilities of technology fails to account for the speed of transition, resting on two assumptions:

“They assume the technology is perfectible very quickly, which it isn’t.”

“They assume organisations’ capacity to change is instant, which it isn’t. If you make these false assumptions, then you make statements like ‘we’re all going to be automated by 2020.”

Understanding the need for new technology

Incorporating new technology into the workplace is essential to sustaining competitiveness and driving operational efficiency within the organisation. However, consideration is still needed to pinpoint the precise value it adds to business needs and whether they also fit with both the organisational and human context. CIPD (2014) notes that areas for consideration to initially factor before deciding what new technology needs to be incorporated include:

  • Assessing the type of organisation
  • Assessing the type of employees
  • Assessing the context
  • Assessing the technology itself

A clear, actionable plan with clear objectives is the basis of a successful approach to gauging and fully understanding business needs. When taking on new technology, a positive approach includes conducting a quick, low-budget pilot framed around clear, non-technology-based outcomes: leadership connectivity, engagement, speed of decision-making, and authenticity from people's perspective. Doing so is likely to guide the organisation efficiently towards simple organisational goals, with technology as the enabler.

There is also the flip side: organisations failing to consider people's needs alongside business objectives.

This often occurs when businesses treat technology as an end in itself for 'easing' back-office processes without accounting for the employee or customer, and are then likely to move slower than needed. Systems and software should instead be designed around building a people experience. Having a clear business need for the technology, grounded in an understanding of employee and customer needs, is crucial: this clarity, rather than whether the technology is deemed 'cool', is the defining characteristic of success.

Communicating the need for new technology

Once the organisation is clear about the technological changes entering the workplace, the next challenge is communicating the needs, objectives and plans convincingly to employees. This means offering reassurance about the anxieties they may hold of technology replacing their jobs in the short or long term. For some organisations, such as Siemens Congleton, carefully implementing technology to streamline operations meant that employee concerns about potential redundancy were effectively and proactively managed, with the awareness that such increased demands might not last long. Allaying such concerns was crucial to the success of the technological change and to maintaining service quality and productivity.

Furthermore, technology can also enable a personalised approach to managing and communicating change for individual workers. This includes leaders personalising the level of engagement, training and real-time messaging for individuals, with information about how the technology can help them perform their roles. Examples include providing employees with digital training packages based on how they are forecast to interact with the technology. However, technology should not be treated as a replacement for traditional face-to-face interaction, particularly when handling difficult conversations with employees affected by workplace technology changes. Developing the ability to lead through times of change is therefore crucial.

The role of staff involvement

One key consideration within technological change management is the need for two-way employee-employer communication. As mentioned earlier, while it is crucial to be transparent and clear about management's plans and intentions, it is also crucial to gain employee feedback at every stage of the technological change process and to encourage involvement and inclusivity, rather than employees being passively subjected to it. Where engagement is lacking, there is a high potential for change resistance among employees. Examples of staff engagement include consistent staff consultations.

Technology changes within organisations often involve trade unions and employee representatives, who offer practical support and advice where jobs might be at risk. Sustaining staff involvement during consultations allows transparency and justification of decision-making. It also builds employee confidence, since 'a climate of trust offers employees with the confidence to participate fully' (Godwyn and Gittell, 2011). A successful example comes from Germany, where, following recommendations from an Industry 4.0 white paper, an employer-trade union agreement was reached in 2016 between Deutsche Bahn (DB) and the Railway & Transport Union to regulate the limits and use of new technology on its trains and to govern future technological trends affecting train guards. The agreement treated close cross-collaboration and consultation with 'collective bargaining and social partners, works councils and employees' as crucial to delivering the transition successfully. While recognising that some jobs (i.e. train guards) may not be required in future, DB guaranteed no redundancies until automatic door operations were implemented, giving it time to develop and redeploy guard roles, with guarantees of training for those facing changed job responsibilities or profiles. Being employee-centric in the digital age therefore offers the additional benefit of fostering a co-operative dimension in the senior management-union dynamic, forcing management to tackle difficult questions about workforce engagement with new technology implementations, while direct accountability is also strengthened (Brown, 2014). It also gives employees confidence that they will be equipped with adequate training and redeployment for future career moves.
However, such examples are confined to Germany's coordinated market economy, where labour-capital-state relations are quite different and cross-employer collaboration is encouraged to maintain employee skill levels and foster innovation (Ashton, 2004).

The importance of training and adjustment

While effective support and training seems the most obvious way for employees to understand the technology being used, constantly relying on employees to contact IT or technical support may not be a good solution. It places a considerable burden on IT staff resources, who may not be able to respond to constant IT-related issues. It also risks workplace conflict if workers who do not know how to use the technology effectively are blamed by managers when tech-related failures and mistakes occur.

Learning how to use technology goes beyond formal training sessions. Group and collaborative training is crucial for enabling a value-added learning process, and needs to be encouraged and supported. Some organisations' service quality, for instance, depends on tacit knowledge, continuous learning and information processing fostered by group collaboration. While it is important to spread technical knowledge, it is also crucial for workers to pick up social knowledge, such as knowing where, and to whom, to go for help.

Team stability is another element with two opposite effects. Stability is crucial to supporting the learning of the new technology and ensuring that knowledge is not lost faster than it is gained through staff turnover. Yet over-stability can leave teams stuck in routines that fail to adapt the learning process. This indicates the need to establish new business routines alongside the new technology being rolled out, basing those routines on interaction and consultation between employees and business leaders and on the level of feedback. In "Disrupted Routines: Team Learning and New Technology Implementation in Hospitals", Edmondson et al. (2001) found, in their study of hospitals, that failure to properly change routines was the basis of technological implementation failures. Successful implementers underwent a team-learning process involving four steps: enrolment, preparation, trials and reflection. They used the enrolment stage to motivate the team, designed preparatory practice sessions and early trials to create psychological safety and encourage new behaviours, and used reflective practices to promote shared meaning and drive process improvement where needed.

Changing workforce structure

With technology becoming an ever-greater influence across most workplaces, businesses need effective management structures to handle the rapid changes occurring in the business environment. As discussed earlier regarding the benefits of employee autonomy, employees today are able to quickly learn and leverage technology in productive ways that secure greater decision-making involvement and freedom. Organisations should therefore be 'informative': allowing people to engage, form decisions and complement their skills, and empowering them with technology to drive overall organisational success. With organisations increasingly reliant on cutting-edge technology, they may be encouraged to adopt a flatter management structure with a strong focus on self-directed teams and employee autonomy, as reflected in how tech start-ups are typically organised. Supporting research also indicates that technology itself enables a single manager to effectively oversee many more employees (Krishnan, 2010).

However, technology potentially undermines or erodes some aspects of the line manager's role, such as control of decision-making and specialist knowledge. Previously, leadership meant managers being perceived as the 'experts', holding specialist knowledge and technical expertise along with control of their team; with that came the organisational currency and power attached to being a manager. With technology eroding some of those elements of the line manager's role, further questions remain. What does leadership look like today? Does it come from offering better judgement? Data analytics software today, for instance, gives people the ability to support and form value-driven, evidence-based professional judgement where possible. If we locate people's value as leaders in their judgement, then technology is undoubtedly impacting and facilitating this dramatically.

Furthermore, with analytical data software likely also to be leveraged and accessed by employees, managers may no longer be best placed for decision-making. If decisions are outsourced to algorithms and no longer reliant on human judgement for effectiveness and consistency, management value will be further eroded. Managers must nevertheless still understand the purpose of these technologies (systems) and effectively support employees when they question the decisions being made by algorithms/AI. Several issues arise here, including knowing the right time to question the algorithm, machine or system, and avoiding both over-reliance and mistrust. With AI in particular known for its complexity, there is a transparency issue: how easy is it genuinely to challenge what the machine or system is doing? Another issue is who will be held responsible if something goes wrong with the machine. Both issues have implications, especially for managers.

Other potential impacts include the growth of the 'gig' economy, in which online platforms such as Deliveroo and Uber enable workers to offer services without an attached employee status. The gig economy has grown significantly in recent years, although it remains only a small portion of the labour market, and it is difficult to know whether, or when, it might become a dominant model.

When aligned with technology, the 'gig economy' model takes the challenge to traditional management structures to another level: workers no longer need to report their work to a manager. In other words, they become their own managers, managing and directing their behaviour by following algorithms 'behind the scenes' (ACAS, 2013).
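To make the idea of algorithms directing workers 'behind the scenes' concrete, the following is a deliberately simplified toy sketch, not any platform's actual system: a dispatch rule assigns each incoming job to the nearest available worker, so no human manager is involved in allocating the work. All names, positions and data here are hypothetical.

```python
# Toy illustration of algorithmic work allocation: each job is
# dispatched to the nearest available worker on a simple 2D grid.
# This is a hypothetical sketch, not a real platform's algorithm.
import math

def nearest_worker(job_location, workers):
    """Return the id of the closest available worker, or None if none."""
    best_id, best_dist = None, math.inf
    for worker_id, (x, y) in workers.items():
        dist = math.hypot(x - job_location[0], y - job_location[1])
        if dist < best_dist:
            best_id, best_dist = worker_id, dist
    return best_id

# Hypothetical worker positions.
workers = {"w1": (0.0, 0.0), "w2": (5.0, 5.0), "w3": (1.0, 2.0)}
print(nearest_worker((1.5, 1.5), workers))  # w3 is closest to the job
```

Even in this trivial form, the design choice is visible: the allocation rule, written once by a programmer, replaces the ongoing judgement a supervisor would otherwise exercise, which is exactly the shift in management structure discussed above.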

Taken to the extreme, technology could therefore enable an entire industry in which thousands of people do their work without any managers being employed at all, with programmers writing the algorithms that control worker behaviour in the coming decades. With organisations increasingly intent on leveraging cutting-edge technology to streamline operations, drive efficiency and stay competitive, it is a matter of debate whether it would be wise to favour algorithms at the expense of human management and the traditional concepts of management that have dominated these past few decades.
