Journal of Surgery

Are Our Jobs at Stake? How to Survive the AI Revolution?

by Puja G. Khaitan*

Department of Surgery, Division of Thoracic Surgery, Sheikh Shakhbout Medical City, Abu Dhabi, UAE

*Corresponding author: Puja Gaur Khaitan, Department of Surgery, Division of Thoracic Surgery, Khalifa University, Gulf Medical University, Sheikh Shakhbout Medical City, Abu Dhabi, UAE

Received Date: 08 August 2025

Accepted Date: 15 August 2025

Published Date: 18 August 2025

Citation: Khaitan PG (2025) Are our jobs at stake? How to survive the AI revolution? J Surg 10: 11419. https://doi.org/10.29011/2575-9760.011419

Introduction

Contrary to popular belief, “Artificial Intelligence” (AI) in healthcare is a misnomer. The complexity and nuance of clinical decision-making are anything but artificial. A more fitting term is augmented intelligence: a suite of tools designed not to replace clinicians but to enhance their capabilities. This editorial highlights how such systems are reshaping modern medicine and what clinicians must do to responsibly navigate this transformation.

AI Is Everywhere, Including Medicine

AI is now embedded in nearly every domain of healthcare, from appointment scheduling and teleconsultations to generating differential diagnoses and supporting treatment planning. AI systems assist with administrative tasks such as coding, billing, and resource allocation, promising greater efficiency and cost savings for institutions [1]. But these gains bring a critical question: will clinicians still be needed? The short answer is yes, but the long-term outlook is complex. Multiple studies suggest that routine physician and nursing tasks, such as ECG interpretation, radiograph review, and medication dispensing, are increasingly vulnerable to automation [2,3].

The Evolving Threat to Healthcare Jobs and Compensation

AI’s growing role in healthcare has raised concerns about job displacement and wage compression. Analyses from labor economists and policy think tanks show that AI tends to affect middle-skill, high-repetition roles first, with radiologists, pathologists, and pharmacists among the most exposed [4]. Meanwhile, administrative and logistical roles (e.g., hospital laundry, supply chain) are already seeing early replacement by AI-enabled robots [5]. Additionally, productivity gains from AI may depress clinician compensation: institutions and payers may capture a disproportionate share of the benefit, while the cognitive load and responsibility on remaining staff increase [6]. Without proactive policy and professional adaptation, AI could inadvertently undermine physician autonomy and pay, especially for early-career professionals in high-exposure specialties [7].

Automation Bias and Clinical Safety

AI’s greatest clinical value lies in its ability to support repetitive tasks and pattern recognition. Deep learning tools now rival expert physicians in diagnosing pneumonia from chest X-rays, detecting skin cancers, and analyzing retinal images [3,8]. However, over-reliance on such tools, termed automation bias, can lead to diagnostic errors, especially when systems make confident but incorrect recommendations [9]. This risk is amplified by the lack of transparency in many proprietary models. Without explainable algorithms and human oversight, clinicians may inadvertently defer judgment to a black box, undermining both safety and accountability.

The Role of Clinicians in an AI-Augmented Era

Physicians must continue to perform fundamental tasks: interpreting imaging, critically evaluating data, and synthesizing evolving literature. AI should complement, not replace, this clinical reasoning, at least for the time being. Healthcare is not only data-driven but also emotionally and ethically complex, requiring sensitivity to patient values, cultural context, and shared decision-making. No AI tool can replace the clinician’s ability to read social cues, respond to them, and adapt treatment plans accordingly. In specialties like surgery, where robotic assistance is already established, AI is expanding its role. Intelligent surgical staplers, image-guided dissection tools, and preoperative modeling are becoming more autonomous. Yet, until clear legal frameworks are established, accountability for adverse outcomes still lies with the surgeon, not the software [10].

The Infrastructure Gap: Interoperability and AI Waste

A major challenge to effective AI integration, and to building tools that generalize across sites, is the lack of interoperable Electronic Health Records (EHRs). Without standardized data formats, AI systems built in one hospital may be useless in another. This siloed development leads to redundant tools, low-quality outputs, and wasted cloud storage, sometimes termed “AI waste.” Recent partnerships, such as Epic’s integration with OpenAI’s GPT-4 via Microsoft Azure, represent steps toward scalable infrastructure. Still, global adoption of universal, structured EHR systems is necessary to ensure model generalizability and meaningful clinical insight [1], and we currently remain a considerable distance from that goal.
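To make the interoperability problem concrete, the short Python sketch below illustrates, using entirely hypothetical field names (no vendor’s actual export format is implied), how two hospitals might encode the same laboratory value differently, and how mapping both onto one shared, FHIR-like schema lets a single downstream tool read either. It is a minimal illustration of the translation burden that standardized EHRs would remove, not a description of any existing system.

    # Hypothetical site-specific records: same hemoglobin result, different local keys and units.
    hospital_a_record = {"pt_id": "A-102", "HGB_gdl": 13.4, "drawn": "2025-08-01"}
    hospital_b_record = {"mrn": "B-774", "lab_hemoglobin": 134, "units": "g/L",
                         "collected_at": "2025-08-01"}

    def to_common_observation(record: dict) -> dict:
        """Map a site-specific lab record onto one shared schema
        (loosely inspired by a FHIR-style observation; simplified for illustration)."""
        if "HGB_gdl" in record:                      # hospital A's local format
            return {"patient": record["pt_id"],
                    "code": "hemoglobin",
                    "value": record["HGB_gdl"],      # already in g/dL
                    "unit": "g/dL",
                    "effective": record["drawn"]}
        if "lab_hemoglobin" in record:               # hospital B's local format
            value = record["lab_hemoglobin"]
            if record.get("units") == "g/L":         # normalize units as well as names
                value = value / 10.0
            return {"patient": record["mrn"],
                    "code": "hemoglobin",
                    "value": value,
                    "unit": "g/dL",
                    "effective": record["collected_at"]}
        raise ValueError("unrecognized local schema")

    # A single downstream model-input function can now serve both sites.
    for rec in (hospital_a_record, hospital_b_record):
        obs = to_common_observation(rec)
        print(obs["patient"], obs["value"], obs["unit"])

Every new site added to such a pipeline requires another hand-written mapping; universal, structured EHR standards would shift that burden from each AI developer onto the record format itself.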

Conclusion: Embrace with Vigilance

AI in healthcare is here to stay, but it must be used responsibly, with rigorous validation, explainability, and real-world oversight. Clinicians must remain clinically sharp by interpreting data firsthand, demand transparency in algorithms, and engage with AI research and policy development. The future of medicine will be shaped not by algorithms alone, but by how clinicians choose to guide, govern, and grow with them.

References

  1. Davenport T, Kalakota R (2019) The potential for AI in healthcare. Future Healthc J 6: 94-98.
  2. Hannun AY (2019) Cardiologist-level arrhythmia detection with deep neural networks. Nat Med 25: 65-69.
  3. McKinney SM (2020) International evaluation of an AI system for breast cancer screening. Nature 577: 89-94.
  4. Acemoglu D, Restrepo P (2022) Artificial intelligence, automation, and work. J Econ Perspect 36: 3-20.
  5. Sharma A (2024) AI applications in hospital logistics: a systems review. Am J Data Min Knowl Discov.
  6. Kellogg Insight (2023) Will AI boost your salary, or take your job? Kellogg School of Management.
  7. Ozgul S (2024) AI automation and labor market inequality. Brookings Institution.
  8. Esteva A (2017) Dermatologist-level classification of skin cancer with deep neural networks. Nature 542: 115-118.
  9. Goddard K (2012) Automation bias: a systematic review. BMJ Qual Saf 21: 711-717.
  10. Hashimoto DA (2018) Artificial intelligence in surgery: promises and perils. Ann Surg 268: 70-76.

© by the Authors & Gavin Publishers. This is an Open Access journal article published under the Creative Commons Attribution-Share Alike 4.0 International License (CC BY-SA).
