Christodoulidis G, Tsagkidou K, Bartzi D, Prisacariu IA, Agko ES, Koumarelas KE, Zacharoulis D. Sarcopenia and frailty: An in-depth analysis of the pathophysiology and effect on liver transplant candidates. World J Hepatol 2025; 17(5): 106182 [DOI: 10.4254/wjh.v17.i5.106182]
Corresponding Author of This Article
Grigorios Christodoulidis, MD, PhD, Department of General Surgery, University Hospital of Larissa, Mezourlo, Larissa 41110, Thessalia, Greece. gregsurg@yahoo.gr
Research Domain of This Article
Oncology
Article-Type of This Article
Minireviews
Open-Access Policy of This Article
This article is an open-access article which was selected by an in-house editor and fully peer-reviewed by external reviewers. It is distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/
Co-first authors: Grigorios Christodoulidis and Kyriaki Tsagkidou.
Author contributions: Christodoulidis G designed the overall concept and outline of the manuscript; Christodoulidis G and Tsagkidou K contributed equally to this article, they are the co-first authors of this manuscript; Christodoulidis G, Tsagkidou K, Bartzi D, Prisacariu IA, Agko ES, Koumarelas KE, and Zacharoulis D contributed to the discussion and design of the manuscript, writing, editing the manuscript, and review of literature; and all authors thoroughly reviewed and endorsed the final manuscript.
Conflict-of-interest statement: All the authors report no relevant conflicts of interest for this article.
Received: February 18, 2025 Revised: April 9, 2025 Accepted: May 7, 2025 Published online: May 27, 2025 Processing time: 98 Days and 11.1 Hours
Abstract
Cirrhosis represents the end stage of chronic liver disease, significantly reducing life expectancy as it progresses from a compensated to a decompensated state, leading to serious complications. Recent improvements in medical treatment have shifted the management of cirrhosis. Various causes, including hepatitis viruses, alcohol consumption, and fatty liver disease, contribute to cirrhosis and are closely linked to liver cancer. The disease develops through hepatocyte necrosis and regeneration, resulting in fibrosis and sinusoidal capillarization, leading to portal hypertension and complications such as ascites, hepatic encephalopathy, and organ dysfunction. Cirrhosis also carries an increased risk of hepatocellular carcinoma. Diagnosing cirrhosis involves assessing fibrosis scores through blood tests and measuring liver stiffness through elastography. Liver transplantation is the definitive treatment for end-stage liver disease and acute liver failure.
Core Tip: Cirrhosis describes a complex pathophysiological condition that causes diffuse damage to the liver, leading to hepatocyte degeneration and necrosis, excessive fibrosis, and the development of regenerative nodules as fibrous tissue encases surviving hepatocytes. The term sarcopenia, introduced by Rosenberg in 1989, initially referred only to the loss of muscle mass; the quality of muscle was later incorporated into the definition. Frailty is a common condition in patients with cirrhosis and is associated with increased morbidity and mortality. Sarcopenia and frailty are distinct entities that share common pathophysiologic mechanisms, and both affect the clinical course of candidates before and after liver transplantation.
Hepatocellular carcinoma (HCC) is the most common primary liver malignancy, accounting for over 80% of primary liver cancers, while intrahepatic cholangiocarcinoma arises from the bile ducts. HCC is strongly associated with liver cirrhosis (LC) but can, in rare cases, develop in a non-cirrhotic liver. Its incidence has increased globally, particularly in Western countries, owing to the rising prevalence of non-alcoholic fatty liver disease (NAFLD). Liver cancer ranks as the fifth most common cancer and the fourth leading cause of cancer-related deaths worldwide, with men at significantly higher risk (male-to-female incidence ratio of 2.8:1). Prognosis remains poor, with a five-year survival rate of 19.6%, which drops drastically to 2.5% in metastatic cases. The development of HCC is driven by various molecular abnormalities, including cell cycle deregulation, DNA methylation alterations, chromosomal instability, immunomodulation, epithelial-to-mesenchymal transition, expansion of HCC stem cells, and microRNA dysregulation. Given its aggressive nature and high mortality rate, ongoing research focuses on early detection, targeted therapies, and optimizing treatment strategies, including liver transplantation (LT) and systemic therapies[1].
Sarcopenia, whether measured independently or as part of physical frailty, is a significant predictor of adverse clinical outcomes before LT, including poor quality of life (QoL), hepatic decompensation [such as hepatic encephalopathy (HE)], infections, and mortality. Additionally, sarcopenia predicts poor outcomes after LT, with affected patients experiencing longer hospital stays, a higher need for mechanical ventilation and intensive care, increased infection rates, greater healthcare costs, and higher 1-year mortality post-transplant. The choice of graft for LT depends on a preoperative assessment that includes factors such as age, disease severity, comorbidities, and surgical history, and plays a critical role in ensuring successful outcomes. A United Kingdom-based study of 232 consecutive LT patients demonstrated that those receiving marginal grafts (i.e., those with steatosis or prolonged cold ischemia times) and exhibiting malnutrition or muscle mass depletion [assessed by computed tomography (CT) imaging] had significantly higher rates of infection and longer intensive care unit (ICU)/hospital stays, irrespective of their baseline model for end-stage liver disease (MELD) score. While further validation is needed, this study underscores the importance of careful graft selection in sarcopenic patients or the need for reevaluation of graft choice following a period of nutritional optimization[2].
ETHNIC VARIATIONS IN SARCOPENIA PREVALENCE
Studies have demonstrated significant disparities in sarcopenia prevalence among various ethnic groups. For instance, research conducted in West China revealed the following prevalence rates among adults aged 50 years and older: Han (22.3%), Tibetan (18.2%), Qiang (11.8%), Yi (34.7%), and Hui (26.7%). These variations may be attributed to genetic factors, lifestyle differences, and environmental influences unique to each ethnic group.
Further investigation in Xining, China, compared sarcopenia prevalence between Han and minority ethnic groups using multiple diagnostic criteria. The findings indicated that Han males had a higher prevalence of sarcopenia compared to their minority counterparts when assessed with the Beijing criteria, while no significant ethnic differences were observed using the Asian Working Group for Sarcopenia (AWGS) 2019 or Lasha criteria. This suggests that the choice of diagnostic criteria can influence the detection of ethnic disparities in sarcopenia prevalence.
REGIONAL DIFFERENCES IN DIAGNOSTIC CRITERIA
The application of various diagnostic criteria across regions further complicates the assessment of sarcopenia. In West China, a study comparing six diagnostic criteria - AWGS 2019, AWGS 2014, the first and second European Working Group on Sarcopenia in Older People definitions (EWGSOP1 and EWGSOP2), the International Working Group on Sarcopenia, and the Foundation for the National Institutes of Health - found prevalence rates ranging from 11.8% to 57.1% among adults aged 50 years and older. Such discrepancies highlight the impact of regional diagnostic preferences and the need for standardized, population-specific criteria. Moreover, a systematic review focusing on Asian populations reported a sarcopenia prevalence of 16.5% among community-dwelling older adults, with variations depending on the diagnostic methods employed. This emphasizes the necessity for regionally adapted diagnostic frameworks that consider local demographic and clinical characteristics[3].
The concept of sarcopenia was introduced by Rosenberg in 1989 and was initially considered an age-related disorder. In 2018, however, the EWGSOP2 updated the original definition, aiming for a more detailed and accurate approach. Today, sarcopenia is characterized as primary, or age-related, when no other cause can be attributed to it, and as secondary when factors other than aging lead to loss of muscle mass, such as systemic diseases involving inflammatory processes, diseases leading to immobility, and overall lifestyle[4]. Frailty, originally defined as a biologic syndrome of reduced physiologic reserve and increased vulnerability, arises from multidimensional dysfunction across systems such as the musculoskeletal, cardiovascular, and immune systems; in decompensated cirrhosis, hepatic-specific factors (e.g., protein synthetic dysfunction, ammonia toxicity, and encephalopathy-related inactivity) predominantly drive physical frailty, with assessments focusing on loss of muscle function[5]. Cachexia, malnutrition, and sarcopenia are distinct conditions with overlapping pathophysiological mechanisms, with cachexia serving as the underlying pathophysiological basis for secondary sarcopenia[6]. Sarcopenia is highly prevalent among patients with cirrhosis; however, its impact on post-LT outcomes remains inconclusive[7]. LT, however, increasingly benefits decompensated cirrhosis as the MELD score rises[8]. Questions therefore arise as to whether LT can improve the outcome of sarcopenia in post-LT patients.
SEARCH METHOD
PubMed, Cochrane Library, MEDLINE, Scopus, clinical trial registers, and Web of Science databases were searched by the authors to retrieve studies reporting data on LC, sarcopenia, frailty, assessment of sarcopenia and frailty in LT, and artificial intelligence (AI) in hepatology, from 2019 to the present. The following medical subject heading terms, alone or combined with the logical operators “OR” or “AND”, were used: “Liver Cirrhosis”, “Sarcopenia”, “Frailty”, “Liver Cancer”, “Hepatocellular carcinoma”, “Muscle mass loss”, “Chronic Liver Disease”, “Fibrosis”, “Liver Transplant”, “Guidelines”, “Complications”, “Malnutrition”, “Role of transplant”, “Artificial Intelligence”, “Machine Learning”, “Deep Learning”, “Muscle Mass Assessment”, “Muscle Wasting”, “Computed tomography”, and “DXA”. Outdated, repetitive, and non-English studies were excluded. After an initial title screening, each relevant article was reviewed in full, and 73 representative scientific papers were finally selected.
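For readers who wish to reproduce or extend such a search programmatically, a minimal sketch is shown below: it assembles a Boolean query from term groups mirroring the strategy above and submits it to the standard NCBI E-utilities esearch endpoint. The grouping of terms, the date handling, and the retmax limit are illustrative choices, not the exact query used for this review.

```python
from urllib.parse import urlencode
from urllib.request import urlopen
import json

# Groups of search terms mirroring the strategy above; terms within a
# group are OR-ed together, and the groups are AND-ed (illustrative grouping).
condition_terms = ["Liver Cirrhosis", "Chronic Liver Disease", "Fibrosis"]
exposure_terms = ["Sarcopenia", "Frailty", "Muscle mass loss", "Muscle Wasting"]
context_terms = ["Liver Transplant", "Hepatocellular carcinoma"]

def or_group(terms):
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = " AND ".join(or_group(g) for g in (condition_terms, exposure_terms, context_terms))

# Standard NCBI E-utilities esearch call, restricted to 2019 onwards as in the review.
params = urlencode({
    "db": "pubmed",
    "term": query,
    "mindate": "2019",
    "maxdate": "2025",
    "datetype": "pdat",
    "retmax": 200,
    "retmode": "json",
})
url = f"https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi?{params}"
with urlopen(url) as resp:
    result = json.load(resp)
print(result["esearchresult"]["count"], "records matched")
```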
PATHOPHYSIOLOGICAL MECHANISMS IN SARCOPENIA AND FRAILTY
Cirrhosis is a progressive liver disease marked by hepatocyte degeneration, necrosis, and fibrosis, leading to the formation of regenerative nodules that impair liver function. Sarcopenia, a prevalent complication affecting 30%-70% of cirrhotic patients, worsens as the disease advances. It exacerbates liver dysfunction, particularly in decompensated cirrhosis, where complications such as ascites, hepatic encephalopathy (HE), and HCC further reduce nutritional intake, perpetuating a cycle of muscle wasting. Factors like diminished appetite, impaired gastrointestinal motility, and altered taste perception contribute to a negative energy balance, activating catabolic pathways that accelerate muscle breakdown.
Reduced physical activity, especially in cirrhotic patients awaiting LT, suppresses muscle protein synthesis (MPS) by downregulating the mammalian target of rapamycin (mTOR) pathway. In contrast, resistance exercise enhances insulin-like growth factor 1 levels, promoting muscle growth. Cirrhosis-induced hypermetabolism intensifies muscle catabolism through increased proteolysis and lipid breakdown, leading to significant muscle depletion. Additionally, disrupted gluconeogenesis and reduced hepatic glycogen stores accelerate the loss of branched-chain amino acids, further impairing muscle metabolism. Elevated myostatin levels exacerbate muscle degradation by inducing oxidative stress and upregulating muscle-specific E3 ligases, atrogin-1 and muscle-specific ring finger protein 1.
Hyperammonemia is closely linked to sarcopenia, as excessive ammonia levels activate inhibitor of nuclear factor kappa-B kinase, triggering nuclear factor kappa-B activation and increased myostatin expression. This disrupts MPS, induces oxidative stress, and accelerates muscle loss. Studies have shown a strong correlation between elevated ammonia levels in both blood and muscle and increased myostatin expression, highlighting the need for effective hyperammonemia management in cirrhotic patients[9]. Additionally, chronic inflammation further exacerbates muscle degradation by increasing pro-inflammatory cytokines such as tumor necrosis factor-alpha and interleukin-6, which activate the ubiquitin-proteasome pathway and drive proteolysis. Mitochondrial dysfunction, oxidative stress, and systemic inflammation collectively contribute to sarcopenia progression. Sarcopenia worsens cirrhosis outcomes, increasing mortality, infection risk, and post-transplant complications. It is an independent risk factor for the progression of NAFLD and non-alcoholic steatohepatitis, which accelerate liver fibrosis. Early identification and monitoring of sarcopenia are essential for improving clinical outcomes[9].
Insulin and glucagon dysregulation: Insulin plays a critical anabolic role in skeletal muscle by inhibiting proteolysis and stimulating protein synthesis through the phosphatidylinositol 3-kinase (PI3K)/protein kinase B (Akt)/mTOR signaling pathway, particularly in the presence of essential amino acids. However, in cirrhosis, the anabolic response to insulin is blunted due to impaired substrate availability and postprandial nutrient handling. Insulin resistance, a common feature of chronic liver disease (CLD) regardless of etiology, arises from decreased hepatic clearance, portosystemic shunting, and compensatory pancreatic hypersecretion. Notably, it may occur in the absence of overt hyperglycemia, a phenomenon termed euglycemic insulin resistance. The presence of insulin resistance is increasingly recognized as a key contributor to sarcopenia in cirrhosis. Impaired activation of the PI3K/Akt/mTOR pathway diminishes MPS while promoting proteolytic pathways, such as the ubiquitin-proteasome system (UPS), resulting in accelerated muscle breakdown. This metabolic shift contributes to progressive loss of skeletal muscle mass (SMM) and function, a hallmark of sarcopenia, and is associated with increased morbidity and mortality in patients with advanced liver disease. Furthermore, insulin resistance disrupts mitochondrial function and impairs muscle regeneration, further exacerbating muscle atrophy.

Hyperglucagonemia, another metabolic hallmark of cirrhosis, arises due to reduced insulin-mediated suppression of glucagon secretion and decreased hepatic glucagon receptor activity owing to hepatocellular loss. The resulting glucagon resistance limits hepatic glycogenolysis and gluconeogenesis, leading to compensatory muscle catabolism as the body mobilizes amino acids from skeletal muscle to sustain hepatic glucose production. This catabolic state reinforces the vicious cycle of sarcopenia in cirrhotic patients. Together, insulin resistance and glucagon dysregulation create a metabolic environment that favors muscle wasting[9].
SARCOPENIA AND SKELETAL MUSCLE LOSS
The balance between MPS and muscle protein breakdown (MPB) regulates skeletal muscle protein turnover. This process is influenced by factors such as nutrition, physical activity, and molecular pathways, including mTORC1, satellite cells, and the UPS. Muscle growth occurs when MPS surpasses MPB, whereas an increase in MPB leads to muscle atrophy. Myostatin plays a role in muscle degradation by inhibiting MPS and stimulating MPB through the suppression of the insulin-like growth factor 1/PI3K/Akt/mTORC1 pathway and the activation of the UPS. In CLD, dysregulated muscle protein turnover contributes to sarcopenia. Research indicates that MPS is often diminished in cirrhosis, but findings on MPB remain inconsistent due to variations in study methodologies. Additionally, cirrhotic patients exhibit anabolic resistance, similar to age-related sarcopenia, which weakens the MPS response to amino acids and exercise. Despite concerns regarding coagulation abnormalities and thrombocytopenia, studies have shown that muscle biopsies are safe in patients with Child-Pugh class A cirrhosis, allowing for more in-depth research into muscle wasting mechanisms[10].
DYSBIOSIS IN LC AND SARCOPENIA
The gut microbiota (GM) plays a fundamental role in immune regulation, maintaining intestinal barrier function, and supporting systemic health. In cirrhosis, an imbalance in GM composition - referred to as dysbiosis - leads to increased intestinal permeability and bacterial translocation, contributing to inflammation and liver damage in conditions such as NAFLD and LC. Additionally, portal hypertension exacerbates bacterial translocation, heightening the risk of complications like spontaneous bacterial peritonitis, HE, sarcopenia, and HCC. Inflammatory and oxidative stress responses triggered by dysbiosis exacerbate muscle loss by increasing endotoxemia, impairing anabolic signaling, and worsening insulin resistance. Dysfunction of the gut-liver-muscle axis is believed to drive sarcopenia through mechanisms such as systemic inflammation, mitochondrial dysfunction, and enhanced catabolic activity. The cirrhosis dysbiosis ratio, which quantifies the balance between beneficial and harmful bacteria, declines as liver disease progresses and is linked to an increased risk of acute-on-chronic liver failure (ACLF) and mortality.
The GM also influences muscle health by modulating bile acid metabolism through the Takeda G protein-coupled receptor 5 receptor, which impacts energy expenditure and muscle function. In cirrhosis, disrupted bile acid signaling may contribute to sarcopenia by impairing lipid metabolism and mitochondrial activity. Additionally, gut-derived endotoxins stimulate pro-inflammatory cytokines such as tumor necrosis factor-alpha and interleukin-6, exacerbating muscle breakdown[11]. Addressing dysbiosis presents a potential therapeutic strategy for managing liver disease progression and sarcopenia. The non-absorbable antibiotic rifaximin has been shown to lower ammonia levels and endotoxemia in cirrhosis, though its direct effects on muscle health remain uncertain. Probiotics may help restore GM balance and enhance intestinal barrier function, potentially reducing the severity of HE and hospitalization rates, though their role in sarcopenia prevention requires further investigation. Exercise has been found to increase GM diversity and promote the production of short-chain fatty acids, which benefit metabolic health and may reduce hepatic venous pressure gradient. While studies in athletes and NAFLD patients suggest that exercise positively impacts GM, more research is needed to assess its effectiveness in cirrhotic patients with sarcopenia[11].
SARCOPENIA AND POST-TRANSPLANT OUTCOMES
Sarcopenia prior to LT is associated with nearly twice the risk of post-transplant mortality, as shown in studies using psoas muscle area and skeletal muscle index (SMI) assessments. A 2016 meta-analysis of 19 cohorts involving 3803 patients found that sarcopenia significantly increased post-LT mortality risk [hazard ratio (HR) = 1.84, 95% confidence interval (CI): 1.11-3.05]. The mortality risk was 1.8 times higher in Europe and North America and exceeded twofold in Asia.
Sarcopenic patients are more susceptible to bacterial and fungal infections following LT, likely due to the immunomodulatory role of skeletal muscle. They also tend to require longer ICU stays, increased mechanical ventilation support, and higher fresh frozen plasma transfusions, potentially due to myosin’s role in coagulation. Sarcopenic patients have been found to have higher Clavien-Dindo scores, indicating more severe postoperative complications. While cross-sectional imaging [CT, magnetic resonance imaging (MRI)] remains the preferred method for diagnosing sarcopenia, frailty assessments include the liver frailty index (LFI), six-minute walk test, and grip strength measurements[12].
INTERVENTIONS FOR SARCOPENIA IN CIRRHOSIS
Sarcopenia management in cirrhosis involves a combination of nutritional support, structured exercise programs, multimodal therapies, and pharmacological interventions such as l-carnitine and rifaximin. Physical activity improves muscle strength and overall QoL, though its implementation in Child-Pugh C patients remains limited due to safety concerns, particularly in those with varices. Zinc deficiency, which is frequently observed in LC, is linked to sarcopenia and poor prognosis. A study reported that 27.2% of patients with low zinc levels had sarcopenia.
Emerging therapies, including myostatin inhibitors, vitamin D supplementation, and testosterone therapy, show potential but require further research. Exercise interventions have been beneficial in Child-Pugh A patients, enhancing muscle mass and function and reducing portal hypertension. A multicenter study involving 181 hospitalized HCC patients found that exercise independently improved the LFI [odds ratio (OR) = 2.38, P = 0.0091]. However, further studies are needed to determine the most effective exercise regimen and its long-term impact on LC and HCC patients[13]. Sarcopenia, characterized by the progressive loss of SMM and function, poses significant health risks, particularly among individuals with CLD awaiting LT. Effective management necessitates stratified intervention strategies tailored to the severity of the condition. Clinical practice guidelines, such as those from the International Conference on Sarcopenia and Frailty Research and the Academy of Medicine, Singapore, offer comprehensive recommendations for screening, diagnosis, and management of sarcopenia[14].
SCREENING AND DIAGNOSIS
Early identification of sarcopenia is crucial. The International Conference on Sarcopenia and Frailty Research guidelines advocate routine screening in at-risk populations using tools such as the strength, assistance with walking, rising from a chair, climbing stairs, and falls (SARC-F) questionnaire, a five-item instrument covering each of these domains. Confirmatory diagnosis should incorporate objective measures of muscle strength (e.g., handgrip dynamometry), muscle mass [e.g., dual-energy X-ray absorptiometry (DXA) or bioelectrical impedance analysis (BIA)], and physical performance (e.g., gait speed).
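To make the scoring mechanics concrete, a minimal sketch is given below. It assumes the commonly used SARC-F convention of five items scored 0-2 with a total of ≥ 4 flagging sarcopenia risk; the item descriptions in the comments are paraphrases, and the threshold should be confirmed against the validated instrument before use.

```python
from dataclasses import dataclass

# Conventionally, each SARC-F item is scored 0 (no difficulty) to 2
# (a lot of difficulty / unable); falls are scored by count in the past year.
# A total of >= 4 is the commonly cited positive-screen threshold
# (an assumption to verify against the validated instrument).

SARC_F_ITEMS = ("strength", "walking_assistance", "chair_rise", "stair_climb", "falls")

@dataclass
class SarcFResponse:
    strength: int            # 0-2: difficulty lifting/carrying ~4.5 kg
    walking_assistance: int  # 0-2: difficulty walking across a room
    chair_rise: int          # 0-2: difficulty transferring from chair/bed
    stair_climb: int         # 0-2: difficulty climbing 10 stairs
    falls: int               # 0: none, 1: 1-3 falls, 2: >= 4 falls in past year

def sarc_f_score(r: SarcFResponse) -> int:
    values = [getattr(r, item) for item in SARC_F_ITEMS]
    if any(v not in (0, 1, 2) for v in values):
        raise ValueError("each SARC-F item must be scored 0, 1, or 2")
    return sum(values)

def positive_screen(r: SarcFResponse, threshold: int = 4) -> bool:
    return sarc_f_score(r) >= threshold

if __name__ == "__main__":
    patient = SarcFResponse(strength=2, walking_assistance=1,
                            chair_rise=1, stair_climb=0, falls=1)
    print(sarc_f_score(patient), positive_screen(patient))  # 5 True
```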
STRATIFIED INTERVENTION STRATEGIES
Intervention strategies should be individualized based on the severity of sarcopenia.
Mild sarcopenia (presarcopenia)
Exercise interventions: Initiate resistance-based training programs to enhance muscle strength and mass. Progressive resistance exercises performed 2-3 times per week have shown efficacy in improving muscle function.
Nutritional support: Ensure adequate protein intake (1.0-1.2 g/kg body weight/day) to support MPS. Supplementation with essential amino acids, particularly leucine, may be beneficial.
Moderate sarcopenia
Comprehensive lifestyle interventions: Combine resistance exercise with aerobic activities to improve overall physical performance.
Targeted nutritional supplementation: Consider β-hydroxy β-methylbutyrate supplementation to promote muscle mass accretion and function.
Regular monitoring: Schedule periodic assessments to monitor progression and adjust interventions accordingly.
Severe sarcopenia
Multidisciplinary approach: Engage a team comprising hepatologists, physiotherapists, dietitians, and occupational therapists to develop individualized care plans.
Physical therapy: Implement supervised, tailored exercise programs focusing on both resistance and balance training to reduce fall risk and enhance functional independence.
Nutritional interventions: Address potential malnutrition with energy-dense, protein-rich diets, and consider supplementation with omega-3 fatty acids to mitigate inflammation-related muscle degradation.
Pharmacologic considerations: While no medications are currently approved specifically for sarcopenia, ongoing clinical trials are exploring agents such as selective androgen receptor modulators and myostatin inhibitors. These should be considered within clinical trial settings or as part of emerging therapeutic protocols.
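The severity-tiered recommendations above can be encoded as simple structured data, as in the illustrative sketch below; the plan contents mirror the text, the protein calculation uses the 1.0-1.2 g/kg/day range cited for presarcopenia, and none of it should be read as prescriptive dosing.

```python
from typing import TypedDict

class Plan(TypedDict):
    exercise: list[str]
    nutrition: list[str]
    monitoring: str

# Severity-tiered plans mirroring the recommendations above (illustrative only).
PLANS: dict[str, Plan] = {
    "mild": {
        "exercise": ["progressive resistance training 2-3x/week"],
        "nutrition": ["protein 1.0-1.2 g/kg/day",
                      "consider leucine-rich essential amino acids"],
        "monitoring": "reassess at routine visits",
    },
    "moderate": {
        "exercise": ["resistance training", "aerobic activity"],
        "nutrition": ["consider beta-hydroxy beta-methylbutyrate supplementation"],
        "monitoring": "periodic scheduled reassessment",
    },
    "severe": {
        "exercise": ["supervised resistance and balance training"],
        "nutrition": ["energy-dense, protein-rich diet",
                      "consider omega-3 fatty acids"],
        "monitoring": "multidisciplinary review; consider trial enrolment for pharmacologics",
    },
}

def protein_target_g(weight_kg: float,
                     g_per_kg: tuple[float, float] = (1.0, 1.2)) -> tuple[float, float]:
    """Daily protein target range (g/day) from the 1.0-1.2 g/kg guidance."""
    lo, hi = g_per_kg
    return (round(weight_kg * lo, 1), round(weight_kg * hi, 1))

print(protein_target_g(70.0))        # (70.0, 84.0)
print(PLANS["moderate"]["nutrition"])
```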
IMPLEMENTATION IN LT CANDIDATES
For patients with CLD awaiting LT, addressing sarcopenia is vital to optimize transplant outcomes. Prehabilitation programs incorporating tailored exercise and nutritional interventions can enhance physical resilience, potentially reducing postoperative complications and improving recovery trajectories. Regular assessments using standardized tools can aid in monitoring intervention efficacy and guiding adjustments to care plans. In conclusion, stratified intervention strategies for sarcopenia, grounded in current clinical practice guidelines, are essential for improving patient outcomes, particularly in vulnerable populations such as those awaiting LT. Personalized approaches that combine exercise, nutrition, and emerging pharmacological therapies hold promise in mitigating the adverse impacts of sarcopenia[14].
MALNUTRITION
Malnutrition, often accompanying sarcopenia, adversely affects LT outcomes. A study of 162 male LT recipients identified chronic hepatitis C (29.3%) and alcoholic liver disease (23.7%) as the primary transplant indications. Sarcopenia was assessed using the L3-psoas muscle index (L3-PMI), with cutoffs of 340 mm2/m2 for men and 264 mm2/m2 for women. Severe malnutrition correlated with lower L3-PMI, younger age, and male sex. One year post-transplant, 93% of patients survived, with nutritional status improving within 6-9 months. However, nutritional interventions did not significantly impact LT waiting-list patients[15].
Both sarcopenia and malnutrition independently predicted poorer post-LT outcomes, including extended ICU stays (> 5 days in 20.7%), prolonged mechanical ventilation (> 24 hours in 36.2%), and increased infection rates (32.8%). Severe malnutrition and high MELD scores were strong independent predictors of infection risk. Patients with low L3-PMI exhibited significantly higher 12-month mortality (6.9%), longer ICU stays (55.5 vs 2 days, P = 0.034), extended hospitalization (151 vs 22.5 days, P = 0.036), and increased infection rates (100% vs 12.5%, P = 0.016). Since MELD scores alone do not fully capture nutritional deficiencies, additional assessment tools are necessary[15].
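As a worked example of the L3-PMI assessment used in this study, the sketch below normalizes psoas muscle area at L3 to squared height and applies the sex-specific cut-offs quoted above (340 mm²/m² for men, 264 mm²/m² for women); the function names and input handling are illustrative.

```python
# L3 psoas muscle index (L3-PMI) = psoas area at L3 (mm^2) / height^2 (m^2),
# with the sex-specific cut-offs used in the study discussed above.
CUTOFFS_MM2_PER_M2 = {"male": 340.0, "female": 264.0}

def l3_pmi(psoas_area_mm2: float, height_m: float) -> float:
    if height_m <= 0:
        raise ValueError("height must be positive")
    return psoas_area_mm2 / (height_m ** 2)

def is_sarcopenic_by_pmi(psoas_area_mm2: float, height_m: float, sex: str) -> bool:
    try:
        cutoff = CUTOFFS_MM2_PER_M2[sex.lower()]
    except KeyError:
        raise ValueError("sex must be 'male' or 'female'") from None
    return l3_pmi(psoas_area_mm2, height_m) < cutoff

# Example: 1000 mm^2 psoas area at 1.75 m -> PMI ~ 326.5 mm^2/m^2,
# which falls below the male cut-off of 340 mm^2/m^2.
print(round(l3_pmi(1000, 1.75), 1), is_sarcopenic_by_pmi(1000, 1.75, "male"))
```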
FUTURE DIRECTIONS IN SARCOPENIA MANAGEMENT
The future of sarcopenia management emphasizes early detection and intervention, with prevention playing a key role. Primary sarcopenia is largely age-related, whereas secondary sarcopenia is influenced by conditions such as diabetes and chronic lung disease. Research suggests that muscle strength in early adulthood (20-40 years) may predict sarcopenia risk later in life, with grip strength more than 2.5 standard deviations below young adult reference levels serving as a diagnostic criterion, similar to osteoporosis. Exercise remains the most significant modifiable factor in sarcopenia prevention and treatment, underscoring the importance of lifelong physical activity. Although the role of nutrition is still being explored, adequate protein and nutrient intake is essential. Future studies should focus on refining diagnostic criteria, identifying biomarkers for early detection, and developing targeted therapies to address sarcopenia's cellular and molecular mechanisms[16].
EPIGENETICS
The development of drug resistance in HCC involves various epigenetic, non-coding RNA, microRNA, and signaling pathway alterations. DNA methylation plays a critical role, with microrchidia forming a complex with DNA-methyltransferase-3 on the promoters of neurofibromatosis type 2 and kidney and brain protein to cause DNA hypermethylation. Additionally, protein arginine methyltransferase 6 methylates c-Raf proto-oncogene, inhibiting fraser1/Raf proto-oncogene binding and reducing sorafenib resistance. Epigenetic silencing of B-cell lymphoma-2/adenovirus early region 1B 19 kilodalton protein interacting protein 3 is linked to sorafenib resistance, but its restoration can overcome this resistance. Long non-coding RNAs, such as H19, small nucleolar RNA host gene (SNHG) 1, SNHG3, and SNHG16, are upregulated in sorafenib-resistant cells, contributing to resistance through various mechanisms, including activation of the Akt pathway and induction of epithelial-mesenchymal transition (EMT). MicroRNAs such as miR-19a-3p, miR-181a, miR-221, and miR-494 also play pivotal roles in sorafenib resistance by modulating key signaling pathways such as phosphatase and tensin homolog/Akt and Ras association domain family member 1. Furthermore, genes involved in EMT, like Pin1, and cell cycle regulation genes, including forkhead box M1, are upregulated in regorafenib-resistant HCC. Other factors, such as topoisomerase II alpha expression, transforming growth factor signaling, and the activation of glucose metabolism, are also associated with resistance. In the context of lenvatinib resistance, non-coding RNAs like lncXIST, AC026401.3, circPAK1, and circMED27 contribute to drug resistance through mechanisms such as activation of various signaling pathways [enhancer of zeste homolog 2/nucleotide-binding oligomerization domain containing 2/extracellular signal-regulated kinase (ERK), Akt/ERK] and upregulation of ubiquitin specific peptidase 28. The increased expression of transporters such as breast cancer resistance protein and adenosine triphosphate-binding cassette sub-family B member 1, along with hypoxia-related factors like hypoxia-inducible factor 1 and neuropilin-1, further enhances resistance. Moreover, the activation of mitogen-activated protein kinase/mitogen-activated extracellular signal-regulated kinase/ERK signaling and EMT markers, along with the roles of ERK1 and endothelial growth factor receptor 2, are critical in the development of resistance to both lenvatinib and regorafenib[17].

Monotherapy and combination therapies have been extensively studied for the treatment of advanced HCC. Several clinical trials have explored the use of immune checkpoint inhibitors such as nivolumab and pembrolizumab. In the 2017 CheckMate 040 trial (NCT01658878), nivolumab showed promising results, with an objective response rate (ORR) of 15% in the dose-escalation phase and 20% in the dose-expansion phase. Pembrolizumab also demonstrated efficacy in the 2018 KEYNOTE 224 trial (NCT02702414), with an ORR of 17%. The 2019/2022 CheckMate 459 trial (NCT02576509) comparing nivolumab to sorafenib showed higher overall survival (OS) in the nivolumab group, while the KEYNOTE 240 trial (2019) found that pembrolizumab significantly improved OS and progression-free survival (PFS) in the second-line setting. The 2022 KEYNOTE 394 trial (NCT03062358) also demonstrated the superiority of pembrolizumab combined with best supportive care over placebo.
In combination therapy, the CheckMate 040 trial (2020) involving nivolumab plus ipilimumab (programmed death-1 + cytotoxic T-lymphocyte-associated protein 4) showed an ORR of 32% and an OS of 22.8 months, while the ongoing CheckMate 9DW trial is evaluating nivolumab + ipilimumab vs sorafenib or lenvatinib. The 2021 HIMALAYA STRIDE trial (NCT03298451) comparing tremelimumab + durvalumab to sorafenib also showed promising results, with an OS of 16.43 months. Combining immune checkpoint inhibitors with vascular endothelial growth factor inhibitors has also been tested, with the 2020 IMbrave150 trial (NCT03434379) showing a significant improvement in OS (19.2 months vs 13.4 months) for atezolizumab + bevacizumab over sorafenib. Additionally, the LEAP-002 trial (2022) combining pembrolizumab with lenvatinib demonstrated better outcomes in OS and PFS compared to lenvatinib alone. Other combinations, such as camrelizumab + apatinib, have been studied in the 2020 RESCUE trial and the 2022 SHR-1210 trial, both showing improved OS and PFS in the first-line setting. Finally, the COSMIC-312 trial (2021) evaluated cabozantinib + atezolizumab, showing positive results in OS and PFS compared to sorafenib. These trials collectively highlight the evolving landscape of immunotherapy and combination strategies in advanced HCC[18].
A study conducted by Habl et al[19] on allograft tolerance in living donor LT (LDLT) recipients found that 74.7% of recipients achieved spontaneous allograft tolerance after a minimum of 5 years, with 4.7% showing operational tolerance and 95.3% showing proper tolerance. Factors associated with failed tolerance development included higher pre-LT MELD score, acute cellular rejection (ACR) post-LT, post-LT viral hepatitis, and biliary complications. Older recipients were more likely to develop tolerance, while recurrent ACR and viral hepatitis were linked to a higher incidence of non-tolerance.
Compared to other studies where 20%-40% of recipients achieve graft tolerance, this study demonstrated a higher rate, possibly due to differences in demographics, liver pathology, immunosuppression protocols, and postoperative complications. Allograft tolerance, whether therapeutic or spontaneous, remains poorly understood, and the study suggests that immunosuppressive drug withdrawal and liver function stability are potential markers for defining tolerance. The study also identified that recipients with lower immunosuppressive drug levels, particularly everolimus and cyclosporine, had a higher likelihood of developing tolerance. Cyclosporine is known to enhance regulatory T cell function and reduce the risk of rejection, while everolimus inhibits antigen-presenting cell maturation and promotes tolerance. Conversely, a higher percentage of patients receiving mycophenolate mofetil were in the non-tolerant group, as mycophenolate mofetil is often used to treat rejection.
Non-tolerant recipients had higher serum creatinine levels, possibly due to nephrotoxicity from higher tacrolimus doses, and higher gamma-glutamyl transferase and bilirubin levels, suggesting cholestasis-related rejection. Biliary complications were significantly more common in the non-tolerant group, with higher incidences of recurrent viral hepatitis and ACR also associated with lower tolerance. Despite some study limitations, including its retrospective design and small sample size, the study suggests that factors such as MELD score, ACR frequency, post-LT viral hepatitis, and biliary complications can help predict allograft tolerance. Future multicenter studies are needed to confirm these findings[19].
ASSESSMENT OF SARCOPENIA AND FRAILTY
In clinical practice, the identification and evaluation of sarcopenia patients can be a challenging process. The EWGSOP2 recommends the SARC-F questionnaire as a simple, cost-effective tool for screening sarcopenia risk in clinical and community healthcare settings. This self-reported, five-item questionnaire assesses patient-perceived limitations in strength, walking ability, chair rise, stair climbing, and history of falls. While SARC-F has low-to-moderate sensitivity, it demonstrates high specificity, primarily identifying severe sarcopenia cases, making it a practical method to initiate sarcopenia assessment and management in clinical practice[4]. Tables 1 and 2 summarize the available methods of assessing sarcopenia and frailty, respectively.
Parameters affecting the measurement of sarcopenia
Based on its definition, sarcopenia is measured by both the muscle mass quantity and quality. Muscle strength, quantity, and physical performance are critical components for evaluating sarcopenia:
Muscle strength: Grip strength, measured using a handheld dynamometer (e.g., Jamar dynamometer), is a simple, cost-effective, and reliable surrogate for overall muscle strength, with low grip strength strongly predicting adverse outcomes such as disability and mortality. When hand strength cannot be assessed, alternatives like the chair stand test or isometric torque methods for lower limbs may be used.
Muscle quantity: Gold-standard imaging modalities like MRI and CT accurately assess muscle mass but are costly and impractical for routine use. DXA is more accessible and provides consistent results within specific settings, while portable, affordable BIA equipment offers a practical alternative, despite requiring calibration for population-specific equations. Adjustments for body size (e.g., height, body mass index) are necessary for accurate interpretation of muscle mass metrics.
Physical performance: Tests such as gait speed (≤ 0.8 m/second indicating severe sarcopenia), the short physical performance battery (SPPB), the timed-up and go test, and the 400-m walk test are commonly used. Gait speed is particularly practical and predictive of sarcopenia outcomes like disability and mortality, making it the preferred tool for clinical practice, while SPPB and 400-m walk tests are more frequently employed in research due to time and space requirements. These methods offer a comprehensive framework for sarcopenia assessment, balancing practicality, cost, and predictive value[4].
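A minimal sketch of how these three components can be combined into an EWGSOP2-style staging algorithm (probable, confirmed, and severe sarcopenia) is shown below. The gait-speed cut-off comes from the text above; the grip-strength and muscle-mass thresholds are placeholders (commonly cited EWGSOP2-era values are used here as assumptions) and must be replaced with validated, population-specific values in local use.

```python
from enum import Enum

class SarcopeniaStage(Enum):
    NONE = "no sarcopenia"
    PROBABLE = "probable sarcopenia (low strength)"
    CONFIRMED = "confirmed sarcopenia (low strength + low mass)"
    SEVERE = "severe sarcopenia (+ low physical performance)"

# Placeholder thresholds -- replace with validated, population-specific values.
GRIP_CUTOFF_KG = {"male": 27.0, "female": 16.0}      # assumption (commonly cited values)
SMI_CUTOFF_CM2_M2 = {"male": 50.0, "female": 39.0}   # assumption; criteria vary by cohort
GAIT_SPEED_CUTOFF_M_S = 0.8                          # from the text above

def stage(sex: str, grip_kg: float, smi_cm2_m2: float,
          gait_m_s: float) -> SarcopeniaStage:
    """EWGSOP2-style sequence: strength first, then mass, then performance."""
    low_strength = grip_kg < GRIP_CUTOFF_KG[sex]
    if not low_strength:
        return SarcopeniaStage.NONE
    if smi_cm2_m2 >= SMI_CUTOFF_CM2_M2[sex]:
        return SarcopeniaStage.PROBABLE
    if gait_m_s <= GAIT_SPEED_CUTOFF_M_S:
        return SarcopeniaStage.SEVERE
    return SarcopeniaStage.CONFIRMED

print(stage("male", grip_kg=24.0, smi_cm2_m2=45.0, gait_m_s=0.7).value)
# -> severe sarcopenia (+ low physical performance)
```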
Assessment of frailty
Frailty, sarcopenia, and malnutrition are critical factors influencing outcomes in patients with end-stage liver disease (ESLD) and those undergoing LT. Frailty is independently associated with increased waitlist (WL) mortality, hospitalization rates, and reduced QoL, making its early identification and management vital. Tools like the fried frailty index, clinical frailty scale, and LFI are commonly used for frailty assessment. The fried frailty index evaluates five physical domains and predicts morbidity and mortality, though its accuracy may be limited in patients with HE. The clinical frailty scale offers a quick, subjective evaluation of functionality and dependency but lacks precision in tracking changes after interventions. The LFI, a performance-based tool tailored for hepatic patients, assesses grip strength, chair stands, and balance, providing comprehensive insights into frailty-related factors like muscle wasting and malnutrition. It is particularly effective at predicting WL mortality, rehospitalization risks, and post-transplant outcomes, outperforming MELD-sodium (MELD-Na) scores in risk stratification. Regular reassessments are essential, with annual evaluations for compensated cirrhosis and more frequent intervals for decompensated cases. Frailty often worsens post-transplant but may improve modestly within a year, emphasizing the need for pre-transplant interventions to optimize outcomes. Incorporating tools like the LFI into clinical assessments enhances decision-making, facilitates timely interventions, and improves patient management across care settings[20,21].
COMPUTED TOMOGRAPHY-BASED ASSESSMENT OF SKELETAL MUSCLE MASS
SMM in CLD is commonly assessed using CT, BIA, DXA, and MRI. DXA is widely regarded as reliable and valid, but each method has limitations related to cost, accessibility, and accuracy. A study by Kazuki et al proposed new CT-based SMI cut-offs for pre-sarcopenia in CLD patients, derived from appendicular skeletal muscle index by DXA and AWGS criteria. However, discrepancies exist between AWGS and Japanese Society of Hepatology criteria, with the latter showing low diagnostic accuracy in women (accuracy: 0.682, kappa: 0.402) due to biases in measurement sites and lower muscle quality in women. Hormonal factors, such as estrogen, influence muscle quality, but SMI at the L3 level does not assess this, necessitating improved criteria that consider both muscle mass and quality.
The handgrip strength test is a non-invasive, low-cost sarcopenia screening tool but is affected by instrument variability and testing protocols. SMM measurement, though more reliable, is limited by high cost and invasiveness. Advances in AI could enhance accuracy. Given its importance, routine SMM assessments and pre-sarcopenia diagnoses are valuable for improving clinical outcomes in CLD[22,23]. SMM is a key predictor of prognosis and survival in CLD, with an annual decline of 2.2% in cirrhotic patients reported by Hanai et al[23].
In a study by Georgiou et al[24], 97 Caucasian patients with cirrhosis (59.8% male; mean age 59.1 ± 11.6 years; 45.4% with decompensated disease) were evaluated to compare methods for assessing muscle mass. Sarcopenia was assessed using the appendicular lean mass index via DXA and the SMI at the third lumbar vertebra (L3-SMI) via CT, with five different L3-SMI cutoff criteria. The prevalence of low muscle mass varied, observed in 13.4% of patients using the appendicular lean mass index and 26.8%-45.4% using the various L3-SMI thresholds. Among these, the cutoffs proposed by Carey et al[25], Prado et al[26], and Montano-Loza et al[27] demonstrated similar diagnostic performance, each with a sensitivity of 69.2% and specificities ranging from 75.0% to 79.8%. Notably, the Carey et al[25] criteria showed the highest diagnostic validity when benchmarked against DXA, with a multivariate-adjusted OR of 5.88 (95%CI: 1.36-25.4; P = 0.018), indicating a strong association between CT-based measurements and sarcopenia defined by DXA. Table 3 summarizes the common CT-based cut-off values for sarcopenia diagnosis at the L3 vertebral level.
Table 3 Commonly used computed tomography-based cut-off values for sarcopenia diagnosis at the L3 vertebral level.
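Because published L3-SMI cut-offs differ between cohorts, evaluating a single patient against several criteria side by side makes the discordance described above explicit. The sketch below does this; the threshold values in the dictionary are illustrative placeholders, not the actual cut-offs of Carey et al[25], Prado et al[26], or Montano-Loza et al[27], which should be taken from the original publications or Table 3.

```python
# L3-SMI = skeletal muscle area at L3 (cm^2) / height^2 (m^2).
# Threshold values below are illustrative placeholders only; substitute the
# sex-specific cut-offs from the cited publications (see Table 3).
CRITERIA_CM2_M2 = {
    "criteria_A": {"male": 50.0, "female": 39.0},
    "criteria_B": {"male": 52.0, "female": 38.0},
    "criteria_C": {"male": 48.0, "female": 41.0},
}

def l3_smi(muscle_area_cm2: float, height_m: float) -> float:
    return muscle_area_cm2 / (height_m ** 2)

def classify_all(muscle_area_cm2: float, height_m: float, sex: str) -> dict[str, bool]:
    smi = l3_smi(muscle_area_cm2, height_m)
    return {name: smi < cuts[sex] for name, cuts in CRITERIA_CM2_M2.items()}

smi = l3_smi(140.0, 1.70)  # ~48.4 cm^2/m^2
print(round(smi, 1), classify_all(140.0, 1.70, "male"))
# Near the boundary, criteria can disagree -- the same mechanism behind the
# 26.8%-45.4% spread in prevalence described above.
```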
LT
LT remains the definitive treatment for ESLD and acute liver failure, with eligibility determined by factors beyond age, including cardiopulmonary status, frailty, nutritional status, surgical considerations, and malignancy risks (European Association for the Study of the Liver guidelines). Early LT may be indicated for select cases such as severe alcohol-related hepatitis, steroid-resistant autoimmune hepatitis, Wilson’s disease with neuropsychiatric symptoms, and unresectable HCC meeting downstaging criteria. However, active malignancies, significant coronary artery disease, or severe pulmonary hypertension refractory to treatment generally preclude transplantation. For patients with cirrhosis and end-stage kidney disease, simultaneous liver-kidney transplantation is recommended, while frailty and portal vein thrombosis require tailored management through rehabilitation, anticoagulation, or transjugular intrahepatic portosystemic shunt. Special considerations include using hepatitis C virus- and hepatitis B virus-positive grafts under antiviral protocols, addressing alcohol relapse prevention, and monitoring metabolic dysfunction-associated steatotic hepatitis recurrence. Long-term management strategies for primary biliary cholangitis and autoimmune hepatitis involve ursodeoxycholic acid and careful immunosuppression adjustments. Improving post-LT outcomes necessitates mental health support, physical activity promotion, and adherence monitoring, particularly among high-risk populations like adolescents. Expanding LDLT programs and refining organ allocation strategies are critical to reducing wait-list mortality[28].
Since its first successful performance in 1967, LT has evolved from an experimental procedure into a standard therapy, with survival rates improving from 25% in the early years to over 80% following the introduction of calcineurin inhibitors in the 1980s. The MELD score, implemented in 2002, stratifies patients by three-month mortality risk, ranging from 2% for MELD < 9 to 71.3% for MELD > 40. The 2013 Share 35 policy improved post-transplant survival for cirrhotic patients with MELD ≥ 35, though its impact on wait-list mortality varied across racial groups and HCC patients. The more recent MELD 3.0, incorporating sex and serum albumin while capping creatinine at 3 mg/dL, has improved LT access for women and reduced wait-list mortality. However, patients with ACLF remain under-prioritized despite their high mortality risk, with only 0.7% (ACLF-1), 1.9% (ACLF-2), and 2.7% (ACLF-3) receiving transplants, increasing marginally over six months. ACLF patients with MELD < 25 experience disproportionately high wait-list mortality (33%-40%) compared to non-ACLF counterparts (< 10%). MELD scoring may be more predictive in ACLF cases meeting Asian Pacific Association for the Study of Liver Disease criteria, where a ≥ 2-point MELD increase within two weeks correlates with 60-day survival prediction, particularly relevant in regions with a high prevalence of LDLT[29].
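For orientation, the sketch below implements the widely published classic UNOS MELD and MELD-Na formulas (laboratory values floored at 1.0, creatinine capped at 4.0 mg/dL in classic MELD, sodium clamped to 125-137 mmol/L, score capped at 40). It deliberately omits the sex and albumin terms of MELD 3.0, and official calculators should always be used for clinical decisions.

```python
import math

def meld(bilirubin_mg_dl: float, inr: float, creatinine_mg_dl: float,
         on_dialysis: bool = False) -> int:
    """Classic UNOS MELD (simplified sketch; use official calculators clinically)."""
    # Lab values are floored at 1.0; creatinine is capped at 4.0 mg/dL,
    # and set to 4.0 if the patient is on dialysis.
    bili = max(bilirubin_mg_dl, 1.0)
    inr = max(inr, 1.0)
    cr = 4.0 if on_dialysis else min(max(creatinine_mg_dl, 1.0), 4.0)
    score = 10 * (0.957 * math.log(cr) + 0.378 * math.log(bili)
                  + 1.120 * math.log(inr) + 0.643)
    return min(round(score), 40)

def meld_na(bilirubin_mg_dl: float, inr: float, creatinine_mg_dl: float,
            sodium_mmol_l: float, on_dialysis: bool = False) -> int:
    """MELD-Na: sodium adjustment applied when MELD > 11 (UNOS convention)."""
    base = meld(bilirubin_mg_dl, inr, creatinine_mg_dl, on_dialysis)
    if base <= 11:
        return base
    na = min(max(sodium_mmol_l, 125.0), 137.0)
    adjusted = base + 1.32 * (137 - na) - 0.033 * base * (137 - na)
    return min(round(adjusted), 40)

print(meld(3.0, 1.8, 1.5))          # -> 21
print(meld_na(3.0, 1.8, 1.5, 128))  # sodium penalty raises the score to 27
```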
IMPACT OF SARCOPENIA IN LT
Sarcopenia is a recognized risk factor for mortality among LT candidates, as shown in studies such as the analysis by Tandon et al[30] of 142 patients, in which sarcopenia independently increased mortality risk 2.4-fold, particularly in those with MELD < 15. Durand et al[31] reported a 15% rise in mortality risk per unit decrease in the transverse psoas muscle thickness/height ratio, while the multicenter study by Carey et al[25] linked lower L3-SMI to reduced survival (HR = 0.95 per 1 cm2/m2 decrease, 95%CI: 0.94-0.97). A meta-analysis further confirmed an increased mortality risk in sarcopenic patients (pooled HR = 1.72, 95%CI: 0.99-3.00, P = 0.05). Additionally, sarcopenia is associated with higher healthcare costs, with van Vugt et al[32] reporting a median cost of $12289 for sarcopenic patients vs $7484 for non-sarcopenic patients.
However, recent studies emphasize muscle function over mass as a predictor of mortality. Yadav et al[33] found no significant link between sarcopenia and mortality but observed a trend with six-minute walk distance < 250 m (HR = 2.1, 95%CI: 0.9-4.7). Similarly, Wang et al[34] reported grip strength (HR = 0.74 per 5 kg increase, 95%CI: 0.59-0.92) and SPPB score (HR = 0.89 per point increase, 95%CI: 0.82-0.97) as better mortality predictors than muscle mass. These findings suggest that integrating functional assessments with muscle mass metrics may improve risk stratification for LT candidates.
Sarcopenia has been consistently linked to poor post-transplantation outcomes, including increased mortality, extended ICU and hospital stays, and higher rates of infection and sepsis. Sarcopenic patients are estimated to have a 4.8-times higher risk of death within one year after LT compared with non-sarcopenic patients. A 2016 meta-analysis by van Vugt et al[32], incorporating 11 studies, reported a pooled HR of 1.84 (95%CI: 1.11-3.05) for mortality in sarcopenic individuals. Furthermore, sarcopenia has been identified as a risk factor for failure to rescue and poor pulmonary outcomes. For instance, Underwood et al[35] found that patients in the lowest tertile of total psoas area had a 1.4-fold higher adjusted complication rate (91% vs 66%) and a 2.8-fold higher failure-to-rescue rate (22% vs 8%, P < 0.001) within the first post-transplant year. Wada et al[36] demonstrated an association between reduced preoperative psoas volume and postoperative pneumonia, prolonged ventilation, and tracheostomy requirements.
Emerging research highlights the impact of de novo sarcopenia after transplantation. Jeon et al[37] observed that 15% of patients without preoperative sarcopenia developed sarcopenia within a year post-transplant, with a corresponding HR of 10.53 for mortality (95%CI: 1.37-80.93). Similarly, Chae et al[38] identified a postoperative decrease in psoas muscle index of ≤ 11.7% as an independent predictor of mortality (HR = 1.87, 95%CI: 1.07-3.25), occurring in 25% of patients. These findings underscore the importance of recognizing both preoperative and perioperative sarcopenia and suggest that tailored interventions targeting muscle mass and function in ESLD patients could improve post-transplantation outcomes[39].
OUTCOMES AND QOL IN POST- LT PATIENTS
Multiple studies emphasize the critical role of muscle mass and quality in post-LT outcomes. Reduced psoas area has been strongly linked to increased mortality (HR = 3.7 per 1000 mm2 decrease, P < 0.0001). Low psoas muscle index (OR = 3.635, P < 0.001) and high intramuscular fat (OR = 3.898, P < 0.001) are independent mortality risk factors. Sarcopenia is associated with lower survival rates (P < 0.001), higher 12-month mortality risk (OR = 0.996), prolonged hospital stays, and increased postoperative sepsis (17.7% vs 7.4%, P = 0.03). Patients with SMI ≤ 48 cm2/m2 have reduced one-year (86% vs 95%) and three-year survival (73% vs 95%, P = 0.01). Additionally, sarcopenic patients experience longer hospital stays (P = 0.005) and more frequent bacterial infections post-LT (26% vs 15%, P = 0.04). While muscle health significantly affects recovery and survival, its impact on overall mortality remains debated[40].
QoL post LT
In a study of 20 LT recipients from Sri Lanka’s state-funded transplant program, the median age was 54 years (range: 27-67 years), with 85% (n = 17) being male. The median follow-up period was 24 months (6-66 months). All patients had advanced liver disease, classified as Child B (60%, n = 12) or Child C (40%, n = 8), with a median Child score of 10 (7-13). The median MELD score was 17 (12-26), and 75% (n = 15) had cryptogenic cirrhosis as the indication for LT. Control groups A and B had comparable baseline characteristics. Group A had a median age of 54 years (34-67), with 67.5% male, while group B had a median age of 54 years (28-72), with 85% male; however, all group B participants were non-cirrhotic. Post-LT patients experienced significant improvements in QoL compared to those with ESLD, with better physical health (80% vs 7.9%, P < 0.01), emotional well-being (80% vs 61.1%, P < 0.01), and social functioning (86.9% vs 56.9%, P < 0.01). Compared to a non-cirrhotic control group, their QoL was similar, except for better emotional well-being (P = 0.01). At six months post-LT, 15% experienced graft rejection, 25% had infections requiring hospitalization, and 30% had drug-related complications, yet 95% maintained good drug compliance. Marital status and higher education were associated with better QoL, while gender had no significant effect. With post-LT survival exceeding 80% at five years and 70% at ten years, the focus has shifted to long-term QoL. Sri Lanka’s state-funded LT program faces challenges due to resource constraints, but strong family support and high literacy rates contribute to positive patient outcomes. Despite a 26.6% perioperative mortality rate, mainly from sepsis and primary non-function, LT recipients reported a QoL comparable to healthy individuals. However, their perceived well-being may lead to poor long-term drug compliance, emphasizing the need for continued post-transplant monitoring and support[41].
QoL generally improves after LT, but long-term outcomes vary. While psychological and personal function improvements are often sustained, physical distress, social functioning, and general health perception may decline after 12 years. Factors linked to worse post-transplant QoL include HCC, deceased circulatory death donor organs, and liver disease due to hepatitis C virus or alcoholism. A study of 61 recipients found that poor body image was associated with a sedentary lifestyle and lower QoL. Physical activity plays a key role in enhancing QoL post-transplant. Increased mobility, self-care, and higher total energy expenditure correlate with better QoL, with benefits lasting up to five years. While randomized trial evidence is limited, a study of 151 LT recipients found that those in an exercise program had significantly better QoL scores than those receiving usual care. Social support also contributes to well-being, with group activities like the Transplant Games positively impacting physical and emotional health. These activities foster teamwork, self-esteem, and confidence, helping recipients transition from a sedentary pre-transplant state to an active lifestyle[42].
IMPLEMENTING AI, MACHINE LEARNING AND DEEP LEARNING
AI is increasingly being adopted across various fields, including healthcare, where machine learning (ML) has gained recent attention for its potential in predictive modeling. ML models use mathematical functions and rules to classify and predict outcomes with high accuracy. Deep learning, a subset of ML, employs deep neural networks that simulate brain-like processing through multiple layers of artificial neurons. Unlike traditional statistical models, ML efficiently captures complex, non-linear relationships.
The growing availability of electronic health records has fueled the development of ML models for LT. Several studies have applied ML to predict WL mortality and post-transplant outcomes. For instance, Nagai et al[43] developed a neural network model for predicting 90-day LT WL mortality, while Bertsimas et al[44] used optimal classification trees for similar predictions. Compared to the MELD-based allocation, the “optimized prediction of mortality” model showed a potential reduction of approximately 418 WL deaths per year and achieved an area under the receiver operating characteristic curve (AUROC) of 0.859, compared to 0.841 for MELD-Na. Kwong et al[45] further developed a random forest (RF) model identifying key predictive variables, including ascites, tumor size, alpha-fetoprotein, bilirubin, and INR, with a concordance statistic of 0.74 for predicting 3-, 6-, and 12-month WL dropout.
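To illustrate the kind of model behind such results, the sketch below trains a random forest on synthetic data whose features echo the predictors identified by Kwong et al[45] (ascites, tumor size, alpha-fetoprotein, bilirubin, and INR) and reports an AUROC. The data, coefficients, and resulting performance are entirely illustrative; scikit-learn is assumed as the toolkit.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Synthetic stand-ins for the predictors reported above (illustrative only).
X = np.column_stack([
    rng.integers(0, 3, n),       # ascites grade 0-2
    rng.gamma(2.0, 1.5, n),      # tumor size, cm
    rng.lognormal(2.0, 1.0, n),  # alpha-fetoprotein, ng/mL
    rng.lognormal(0.5, 0.6, n),  # bilirubin, mg/dL
    rng.normal(1.4, 0.3, n),     # INR
])
# Synthetic outcome loosely tied to the features (waitlist dropout yes/no).
logit = (0.6 * X[:, 0] + 0.002 * X[:, 2] + 0.8 * (X[:, 3] - 1)
         + 1.5 * (X[:, 4] - 1.4) - 1.0)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("AUROC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
```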
Beyond clinical parameters, psychosocial factors are critical in transplant candidacy. The Stanford Integrated Psychosocial Assessment for Transplant is widely used to assess transplant eligibility. Recently, an AI model using 13 psychosocial variables was developed to predict harmful alcohol use post-LT, with a gradient boosting decision tree achieving a superior positive predictive value (0.820). AI is also being explored for liver graft assessment, such as a support vector machine-based model that identified steatotic grafts with 93% sensitivity and 89% accuracy using smartphone images[46,47].
A single-center observational study conducted by Mauro et al[48] sought to identify clinical factors independently associated with sarcopenia, as determined by the SMI, and to develop a predictive score for assessing sarcopenia in patients listed for LT. Binary logistic regression was used to determine independent predictors of sarcopenia, and the Sarcopenia Hospital Italiano de Buenos Aires (HIBA) score was derived from this model. Internal validation was performed using bootstrapping with correction for optimism. The prognostic utility of the score for predicting WL mortality was assessed via competing risk regression analysis. The study included a total of 215 cirrhotic patients on the LT WL. Independent predictors of sarcopenia included male sex (OR = 6.09; P < 0.001), lower body mass index (OR = 0.74; P < 0.001), higher Child-Pugh score (OR = 1.44; P < 0.001), and a lower creatinine/cystatin C ratio (OR = 0.03; P = 0.007). The Sarcopenia HIBA score, constructed from these variables, demonstrated excellent discriminative ability, with an AUROC of 0.862. During follow-up, 77 patients (36%) underwent LT, 46 (21%) died while on the WL, and 92 (43%) remained alive. After adjusting for MELD-Na, the Sarcopenia HIBA score remained an independent predictor of WL mortality (sub-HR = 1.19; 95%CI: 1.01-1.40; P = 0.042)[48,49].
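A minimal sketch of how such a score can be derived is given below: a logistic model is fitted on the four predictors reported by Mauro et al[48], and its linear predictor serves as an additive risk score. The synthetic data and resulting coefficients are illustrative and do not reproduce the published Sarcopenia HIBA score.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 215  # cohort size reported above; the data below are synthetic

male = rng.integers(0, 2, n)
bmi = rng.normal(26, 4, n)
child_pugh = rng.integers(5, 14, n)
cr_cysc = rng.normal(0.9, 0.2, n)

# Synthetic label following the reported directions of effect: male sex and
# higher Child-Pugh raise risk; higher BMI and Cr/CysC ratio lower it.
logit = (1.2 * male - 0.15 * (bmi - 26) + 0.3 * (child_pugh - 8)
         - 2.5 * (cr_cysc - 0.9) - 0.5)
y = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([male, bmi, child_pugh, cr_cysc])
model = LogisticRegression(max_iter=1000).fit(X, y)

# Turn the fitted linear predictor into a simple additive "score".
def score(row: np.ndarray) -> float:
    return float(model.intercept_[0] + model.coef_[0] @ row)

print("AUROC:", round(roc_auc_score(y, model.decision_function(X)), 3))
print("example linear-predictor score:",
      round(score(np.array([1, 22.0, 11, 0.7])), 2))
```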
Several studies have explored the use of ML in predicting LT outcomes. Hoot et al[46] applied a Bayesian network to analyze 29 pre-transplant variables from 6164 United Network for Organ Sharing database transplants, achieving an AUROC of 0.674-0.681. Maldonado et al[47] used logistic regression to predict psychosocial outcomes in 102 transplant cases with Stanford Integrated Psychosocial Assessment for Transplant scores, reporting a Pearson's correlation coefficient of 0.853. Cruz-Ramírez et al[50] employed a rule-based artificial neural network (ANN) to assess 3-month post-LT survival in 1031 recipients. Khosravi et al[51] applied an ANN to predict 5-year survival in 1168 recipients, achieving 92.73% accuracy and an AUROC of 0.864. Andres et al[52] utilized a custom ML model to predict survival in 2769 patients with primary sclerosing cholangitis. Guijo-Rubio et al[53] analyzed up to 9-year survival in 39095 donor-recipient pairs using Cox regression and gradient boosting, with AUROCs ranging from 0.5668 to 0.6079. Kazemi et al[54] evaluated 6-month survival in 902 recipients, with a support vector machine achieving an AUROC of 0.87 and an accuracy of 0.96. Wadhwani et al[55] predicted 3-year post-LT outcomes in 887 pediatric patients using a RF model, achieving 71% accuracy. Molinari et al[56] used ANN and classification trees to predict 3-month mortality in 30458 recipients, reporting an AUROC of 0.952. Liu et al[57] used RF on serum data from 538 recipients to predict 30-day postoperative mortality, achieving an AUROC of 0.771. Hernaez et al[58] assessed the ability of MELD-Na to predict 90-day mortality in 71894 ACLF patients, revealing an overall standardized mortality ratio of 1.7. Yasodhara et al[59] used gradient boosting and Cox proportional hazards models to predict 5- and 10-year post-LT mortality in 18058 patients, with AUROCs of 0.60-0.70. Ershoff et al[60] employed a deep neural network on 57544 recipients, achieving an AUROC of 0.703. Kantidakis et al[61] used ANN to model patient survival trajectories in 62294 United Network for Organ Sharing patients, incorporating 97 donor-recipient characteristics. Kanwal et al[62] developed an ML model to predict cirrhosis mortality in 107939 patients, achieving an AUROC of 0.78 for 1-year mortality. Nitski et al[63] used deep learning models on 42146 recipients from the Scientific Registry of Transplant Recipients, with external validation on 3269 recipients, reporting Transformer AUROCs of 0.804 (1-year) and 0.733 (5-year). Hakeem et al[64] assessed post-LT survival in 818 LDLT recipients, identifying increased risks of diabetes, hypertension, and mortality in older patients. These studies highlight the evolving role of ML in improving LT prognosis and risk assessment.
Studies utilizing ML to predict graft rejection and failure in LT have explored various models, datasets, and predictive variables. Hughes et al[65] applied an ANN to clinical and biochemical data from 117 recipients, achieving an AUROC of 0.902 for predicting acute rejection at three months post-transplant. Parmanto et al[66] used recurrent neural networks with time-series clinical observations from 293 cases to predict 90-day graft failure. Rajanayagam et al[67] focused on pediatric recipients, analyzing 34 biochemical variables with ANN to predict acute liver failure, achieving a testing accuracy of 78%. Dorado-Moreno et al[68] developed an ordinal ANN based on donor and recipient characteristics to forecast short- and long-term graft survival, reporting an AUROC of 0.96. Zare et al[69] used a multilayer perceptron model incorporating biochemical data to predict acute graft rejection, achieving 90% accuracy. Lau et al[70] employed ANN with donor-recipient parameters selected via RF analysis, yielding an AUROC of 0.835 for 30-day graft failure. Ayllón et al[71] validated their ANN model externally with 822 recipients, obtaining AUROC values of 0.94 for three-month and 0.78 for twelve-month graft survival predictions. Across these studies, significant predictors included recipient disease etiology, MELD score, donor sodium levels, liver function markers, and perioperative factors, underscoring the potential of ML in improving post-transplant outcomes[72].
DISCUSSION
Sarcopenia and frailty are increasingly recognized as interrelated and clinically significant syndromes in patients with CLD and those undergoing LT. Both conditions are independently associated with a spectrum of adverse outcomes, including increased WL mortality, higher incidence of postoperative complications, prolonged hospital stays, and decreased long-term survival rates. Their presence represents a critical component in the comprehensive assessment and risk stratification of LT candidates.
The pathophysiological mechanisms underlying sarcopenia and frailty in CLD are multifactorial and include chronic inflammation, hyperammonemia, hormonal imbalances (e.g., insulin resistance, altered growth hormone and testosterone levels), and impaired nutrient metabolism. These disruptions lead to progressive muscle wasting and functional decline, ultimately reducing physiological reserve. Moreover, sarcopenia and frailty frequently coexist and interact, compounding their individual effects on clinical outcomes.
Validated assessment tools have been instrumental in advancing clinical practice. The use of functional tests such as handgrip strength, gait speed, and the SARC-F questionnaire, combined with imaging modalities like CT, DXA, and BIA, has enhanced the ability to detect sarcopenia and monitor its progression. Composite indices, particularly the LFI, have emerged as robust predictors of adverse outcomes and are now incorporated into many transplant evaluation protocols.
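To illustrate how such a composite index is computed, the sketch below implements the commonly cited LFI formula combining grip strength, chair stands, and balance time. The coefficients and frailty cutoffs are as commonly reported for the LFI, the sex adjustment of grip strength is assumed to be applied upstream, and the original publication should be consulted before any clinical use; this is an illustrative sketch only.

```python
# Illustrative LFI calculator; coefficients/cutoffs as commonly reported
# (verify against the original publication before any clinical use).
def liver_frailty_index(sex_adjusted_grip: float,
                        chair_stands_per_sec: float,
                        balance_time_sec: float) -> float:
    """LFI = (-0.330 * grip) + (-2.529 * chair stands/s) + (-0.040 * balance) + 6."""
    return (-0.330 * sex_adjusted_grip
            - 2.529 * chair_stands_per_sec
            - 0.040 * balance_time_sec
            + 6.0)

def lfi_category(lfi: float) -> str:
    # Commonly used thresholds (assumed here): robust < 3.2, frail >= 4.5.
    if lfi < 3.2:
        return "robust"
    return "frail" if lfi >= 4.5 else "pre-frail"

lfi = liver_frailty_index(sex_adjusted_grip=0.9,
                          chair_stands_per_sec=0.35,
                          balance_time_sec=30)
print(round(lfi, 2), lfi_category(lfi))
```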
Recent studies highlight the dynamic nature of sarcopenia, with data demonstrating that muscle mass can continue to decline during the perioperative period. This underscores the importance of longitudinal monitoring and timely intervention strategies. AI is an emerging field with the potential to transform sarcopenia and frailty assessment through predictive modeling, automated image analysis, and the development of personalized risk stratification tools. Several pilot studies have demonstrated the feasibility and accuracy of AI-based CT and MRI analyses in quantifying muscle mass and identifying high-risk patients.
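To illustrate the measurement such pipelines automate, the sketch below (our example, not taken from the cited pilot studies) thresholds a single axial CT slice at the L3 level to the conventional skeletal-muscle attenuation range of -29 to +150 Hounsfield units and derives the SMI. In a real AI pipeline, a trained segmentation network replaces the simple threshold; only the measurement step is shown here.

```python
# Minimal sketch: SMI (cm^2/m^2) from one L3 axial CT slice in Hounsfield
# units, using a simple attenuation threshold in place of a trained model.
import numpy as np

def skeletal_muscle_index(hu_slice: np.ndarray,
                          pixel_spacing_mm: tuple[float, float],
                          height_m: float) -> float:
    """Muscle cross-sectional area at L3, normalized by height squared."""
    muscle_mask = (hu_slice >= -29) & (hu_slice <= 150)   # muscle HU range
    pixel_area_cm2 = (pixel_spacing_mm[0] * pixel_spacing_mm[1]) / 100.0
    muscle_area_cm2 = muscle_mask.sum() * pixel_area_cm2
    return muscle_area_cm2 / height_m ** 2

# Synthetic example: a 512x512 slice with a rectangular 'muscle' region.
slice_hu = np.full((512, 512), -1000.0)   # air background
slice_hu[200:300, 150:290] = 40.0         # plausible muscle attenuation
print(round(skeletal_muscle_index(slice_hu, (0.8, 0.8), 1.70), 1), "cm^2/m^2")
```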
Future research should prioritize multicenter, prospective studies validating combined measures of muscle mass and function, alongside elucidation of the molecular and genetic pathways involved. The exploration of targeted interventions - including nutritional supplementation, structured physical rehabilitation programs, and pharmacologic agents such as myostatin inhibitors - offers promising therapeutic avenues. Additionally, the development of AI-guided clinical decision-making tools, and consideration of age- and sex-specific differences, may further refine individualized treatment strategies.
CONCLUSION
In conclusion, sarcopenia and frailty are modifiable risk factors with significant prognostic implications in LT candidates. A multidimensional approach integrating functional, anatomical, and technological assessments holds promise for improving transplant candidacy evaluations, optimizing perioperative management, and ultimately enhancing survival and QoL in patients with advanced liver disease.
Footnotes
Provenance and peer review: Invited article; Externally peer reviewed.
Peer-review model: Single blind
Specialty type: Gastroenterology and hepatology
Country of origin: Greece
Peer-review report’s classification
Scientific Quality: Grade B, Grade B
Novelty: Grade B, Grade C
Creativity or Innovation: Grade B, Grade C
Scientific Significance: Grade B, Grade B
P-Reviewer: Li ZP S-Editor: Bai Y L-Editor: A P-Editor: Zhao YQ
REFERENCES
1. Dhaliwal A, Williams FR, El-Sherif O, Armstrong MJ. Sarcopenia in Liver Transplantation: an Update. Curr Hepatol Rep 2020; 19: 128-137.
2. Cruz-Jentoft AJ, Bahat G, Bauer J, Boirie Y, Bruyère O, Cederholm T, Cooper C, Landi F, Rolland Y, Sayer AA, Schneider SM, Sieber CC, Topinkova E, Vandewoude M, Visser M, Zamboni M; Writing Group for the European Working Group on Sarcopenia in Older People 2 (EWGSOP2), and the Extended Group for EWGSOP2. Sarcopenia: revised European consensus on definition and diagnosis. Age Ageing 2019; 48: 16-31.
3. Markakis GE, Lai JC, Karakousis ND, Papatheodoridis GV, Psaltopoulou T, Merli M, Sergentanis TN, Cholongitas E. Sarcopenia As a Predictor of Survival and Complications of Patients With Cirrhosis After Liver Transplantation: A Systematic Review and Meta-Analysis. Clin Transplant 2025; 39: e70088.
4. Dent E, Morley JE, Cruz-Jentoft AJ, Arai H, Kritchevsky SB, Guralnik J, Bauer JM, Pahor M, Clark BC, Cesari M, Ruiz J, Sieber CC, Aubertin-Leheudre M, Waters DL, Visvanathan R, Landi F, Villareal DT, Fielding R, Won CW, Theou O, Martin FC, Dong B, Woo J, Flicker L, Ferrucci L, Merchant RA, Cao L, Cederholm T, Ribeiro SML, Rodríguez-Mañas L, Anker SD, Lundy J, Gutiérrez Robledo LM, Bautmans I, Aprahamian I, Schols JMGA, Izquierdo M, Vellas B. International Clinical Practice Guidelines for Sarcopenia (ICFSR): Screening, Diagnosis and Management. J Nutr Health Aging 2018; 22: 1148-1161.
5. Georgiou A, Papatheodoridis GV, Alexopoulou A, Deutsch M, Vlachogiannakos I, Ioannidou P, Papageorgiou MV, Papadopoulos N, Yannakoulia M, Kontogianni MD. Validation of cutoffs for skeletal muscle mass index based on computed tomography analysis against dual energy X-ray absorptiometry in patients with cirrhosis: the KIRRHOS study. Ann Gastroenterol 2020; 33: 80-86.
6. Wada Y, Kamishima T, Shimamura T, Kawamura N, Yamashita K, Sutherland K, Takeda H. Pre-operative volume rather than area of skeletal muscle is a better predictor for post-operative risks for respiratory complications in living-donor liver transplantation. Br J Radiol 2017; 90: 20160938.
7. Siriwardana R, Gunetilleke B, Jayatunge S, Weerasooriya A, Niriella M, Dassanayake A, Ranaweera S, Tillakaratne S. The long-term quality of life following liver transplantation in a developing country with a free health care system. Ceylon Med J 2022; 67: 89-93.