Objective: During the COVID-19 pandemic, central venous access line teams were implemented at many hospitals throughout the world to provide access for critically ill patients. The objective of this study was to describe the structure, practice patterns, and outcomes of these vascular access teams during the COVID-19 pandemic. Methods: We conducted a cross-sectional, self-reported study of central venous access line teams in hospitals affected by the COVID-19 pandemic. To participate in the study, hospitals were required to meet one of the following criteria: development of a formal plan for a central venous access line team during the pandemic; implementation of a central venous access line team during the pandemic; placement of central venous access by a designated practice group during the pandemic as part of routine clinical practice; or management of an iatrogenic complication related to central venous access in a patient with COVID-19. Results: Participants from 60 hospitals in 13 countries contributed data to the study. Central venous line teams were most commonly composed of vascular surgery and general surgery attending physicians and trainees. Twenty sites had 2657 lines placed by their central venous access line team or designated practice group. During that time, there were 11 (0.4%) iatrogenic complications associated with central venous access procedures performed by the line team or group at those 20 sites. Triple lumen catheters, Cordis (Santa Clara, Calif) catheters, and nontunneled hemodialysis catheters were the most common types of central venous lines placed by the teams. Eight (14%) sites reported experience in placing central venous lines in prone, ventilated patients with COVID-19. A dedicated line cart was used by 35 (59%) of the hospitals. Less than 50% (24 [41%]) of the participating sites reported managing thrombosed central lines in COVID-19 patients.
Twenty-three of the sites managed 48 iatrogenic complications in patients with COVID-19 (including complications caused by providers outside of the line team or designated practice group). Conclusions: Implementation of a dedicated central venous access line team during a pandemic or other health care crisis is a way by which physicians trained in central venous access can contribute their expertise to a stressed health care system. A line team composed of physicians with vascular skill sets provides relief to resource-constrained intensive care unit, ward, and emergency medicine teams with a low rate of iatrogenic complications relative to historical reports. We recommend that a plan for central venous access line team implementation be in place for future health care crises.
Sepsis remains a major public health concern, characterized by marked immune dysfunction. Innate lymphoid cells develop from a common lymphoid precursor and play a role in orchestrating inflammation during the innate response to infection. Here, we investigate the pathologic contribution of group 2 innate lymphoid cells (ILC2s) in a murine model of acute septic shock (cecal ligation and puncture). Flow cytometric data revealed that ILC2s increase in number and percentage in the small intestine and among peritoneal cells, while declining in the liver, at 24 hours after septic insult. Sepsis also resulted in changes in the ILC2 effector cytokine IL-13 and the activating cytokine IL-33 in the plasma of mice and of human patients in septic shock. Of interest, the sepsis-induced changes in cytokines were abrogated in mice functionally deficient in invariant natural killer T cells. Mice deficient in IL-13-producing cells, including ILC2s, had a survival advantage after sepsis, along with decreased morphologic evidence of tissue injury and reduced IL-10 levels in the peritoneal fluid. Administration of an antibody blocking the suppressor of tumorigenicity 2 receptor (ST2, the IL-33 receptor) led to a transient survival advantage. Taken together, these findings suggest that ILC2s may play an unappreciated role in mediating the inflammatory response in both mice and humans; further, modulating the ILC2 response in vivo may allow development of immunomodulatory strategies directed against sepsis.
The liver is an organ that, when dysfunctional in a septic patient, is strongly associated with morbidity and mortality. Understanding the pathophysiology of liver failure during sepsis may lead to improved diagnostics and potential therapeutic targets. Historically, programmed cell death receptor 1 (PD-1) ligand 1 (PD-L1) has been considered the primary ligand for its checkpoint-molecule counterpart, PD-1, with PD-L2 rarely in the immunopathological spotlight. PD-1 and PD-L1 contribute to liver dysfunction in a murine cecal ligation and puncture (CLP) model of sepsis, but virtually nothing is known about PD-L2's role in sepsis. Therefore, our central hypothesis was that sepsis-induced changes in hepatic PD-L2 expression contribute to worsened liver function and, subsequently, more pronounced morbidity and mortality. We found that although PD-L1 gene deficiency attenuated the hepatic dysfunction seen in wild-type mice after CLP, the loss of PD-L2 appeared to actually worsen indices of liver function, along with a trend toward higher liver tissue vascular permeability. Conversely, some protective effects of PD-L2 gene deletion were noted, such as reduced liver/peritoneal bacterial load and reduced IL-6, IL-10, and macrophage inflammatory protein 2 levels following CLP. These diverse actions, as well as the unique expression pattern of PD-L2, may explain why no overt survival advantage was observed in the septic PD-L2−/− mice. Taken together, these data suggest that although PD-L2 has some selective effects on the hepatic response of the septic mouse, these are not sufficient to alter septic mortality in this adult murine model. NEW & NOTEWORTHY Our study shows not only that ligands of the checkpoint protein PD-1 respond inversely to a stressor such as septic challenge (PD-L2 declines, whereas PD-L1 rises) but also that aspects of liver dysfunction increase in septic mice lacking the PD-L2 gene.
Furthermore, these differences in PD-L2 gene-deficient animals culminated in the abrogation of the survival advantage seen in the septic PD-L1-knockout mice, suggesting that PD-L2 may have roles beyond a simple immune tolerogen.
We have shown that invariant natural killer T (iNKT) cells mediate sepsis-induced end-organ changes and immune responses, including macrophage bacterial phagocytosis, an effect regulated by the checkpoint protein programmed cell death receptor-1 (PD-1). Furthermore, PD-1 mediates mortality in both adult and neonatal murine sepsis as well as in surgical patients. Given these previous findings, we hypothesized that iNKT cells also modulate neonatal sepsis survival, and that this effect is regulated in part through PD-1. We utilized a polymicrobial intraperitoneal cecal slurry (CS) sepsis model in wild-type (WT), iNKT−/−, or PD-1−/− 5–7-day-old neonatal pups. Tissues were typically harvested at 24 h for various bioassays/histology and, in some cases, survival was assessed for up to 7 days. Interestingly, similar to what we recently reported for PD-1−/− mice following CS, iNKT−/− animals exhibited markedly improved survival vs. WT. Histologically, minor alterations in liver architecture noted in WT pups were attenuated in both iNKT−/− and PD-1−/− pups. Following CS, PECAM-1 expression was unchanged in WT pups but increased in both iNKT−/− and PD-1−/− pups. In WT pups, CS led to the emergence of a Ly6C-low subpopulation among the influxing peritoneal macrophages; conversely, iNKT−/− pups had fewer peritoneal macrophages and a greater percentage of Ly6C-high macrophages. We show not only a key role for iNKT cells in end-organ damage and alterations in phagocyte phenotypes in neonatal sepsis, but also that this iNKT cell-mediated effect is driven by the central checkpoint protein PD-1.
Purpose: We investigated the metabolic response of lung cancer to radiotherapy or chemoradiotherapy by 18F-FDG PET and its utility in guiding timely supplementary therapy. Methods: Glucose metabolic rate (MRglc) was measured in primary lung cancers during the 3 weeks before, and 10–12 days (S2), 3 months (S3), 6 months (S4), and 12 months (S5) after, radiotherapy or chemoradiotherapy. The association between the lowest residual MRglc, representing the maximum metabolic response (MRglc-MMR), and tumor control probability (TCP) at 12 months was modeled using logistic regression. Results: We accrued 106 patients, of whom 61 completed the serial 18F-FDG PET scans. The median values of MRglc at S2, S3, and S4 determined using a simplified kinetic method (SKM) were, respectively, 0.05, 0.06, and 0.07 μmol/min/g for tumors with local control and 0.12, 0.16, and 0.19 μmol/min/g for tumors with local failure; the maximum standardized uptake values (SUVmax) were 1.16, 1.33, and 1.45 for tumors with local control and 2.74, 2.74, and 4.07 for tumors with local failure (p < 0.0001). MRglc-MMR was realized at S2 (MRglc-S2), and the values corresponding to TCP of 95%, 90%, and 50% were 0.036, 0.050, and 0.134 μmol/min/g using the SKM and 0.70, 0.91, and 1.95 using SUVmax, respectively.
Probability cut-off values were generated for a given level of MRglc-S2 based on its predicted TCP, sensitivity, and specificity; MRglc ≤0.071 μmol/min/g and SUVmax ≤1.45 were determined to be the optimum cut-off values for a predicted TCP of 80%, sensitivity of 100%, and specificity of 63%. Conclusion: The cut-off values (MRglc ≤0.071 μmol/min/g using the SKM and SUVmax ≤1.45) need to be tested for their utility in identifying patients with a high risk of residual cancer after standard-dose radiotherapy or chemoradiotherapy and in guiding a timely supplementary dose of radiation or other means of salvage therapy. Electronic supplementary material: The online version of this article (doi:10.1007/s00259-013-2348-4) contains supplementary material, which is available to authorized users.
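The reported operating points (TCP of 95% at MRglc 0.036, 90% at 0.050, and 50% at 0.134 μmol/min/g) are roughly consistent with a two-parameter logistic curve in MRglc. As a back-of-envelope consistency check (not the authors' fitted model, which was estimated from patient-level data), one can solve a logistic exactly through the 95% and 50% points and see that the interpolated TCP at the remaining 0.050 point lands near the reported 90%:

```python
import math

# Reported operating points from the abstract (SKM values, umol/min/g):
# TCP 95% at MRglc 0.036, TCP 50% at MRglc 0.134.
# Assume a two-parameter logistic: TCP(x) = 1 / (1 + exp(-(b0 + b1*x)))
x95, x50 = 0.036, 0.134

# Solve b0, b1 from logit(0.95) = b0 + b1*x95 and logit(0.50) = 0 = b0 + b1*x50
logit95 = math.log(0.95 / 0.05)   # ln(19) ~ 2.944
b1 = logit95 / (x95 - x50)        # negative slope: higher residual MRglc, lower TCP
b0 = -b1 * x50

def tcp(mrglc):
    """Tumor control probability under the interpolated logistic."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * mrglc)))

print(round(tcp(0.036), 3))   # 0.95 by construction
print(round(tcp(0.134), 3))   # 0.5 by construction
print(round(tcp(0.050), 3))   # ~0.93, close to the reported 90% point
```

Because this interpolation is not the published patient-level fit, the agreement at the 90% point (about 0.93 vs. the reported 0.90) is only approximate; it simply illustrates that the three reported cutoffs behave like points on a single logistic dose-response in MRglc.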