Article

Using Time Dependent Covariates and Time Dependent Coefficients in the Cox Model


... We calculated and plotted attack rates for the vaccinated and unvaccinated populations for each day of our analysis for each vaccine type. As is common in vaccine effectiveness analyses (12)(13)(14)(15)(16), we fitted standard time-dependent Cox proportional hazards regression models using the coxph function from the survival package (the specific function call was: coxph(Surv(timestart, timestop, covidstatus) ~ vaccinestatus, data = timedependentdataframe, ties = "breslow")) to calculate hazard ratios for COVID-19 by vaccination status and vaccine type (17)(18)(19)(20)(21). Vaccination status was a time-dependent covariate that changed to partially vaccinated when the person received their 1st dose and to fully vaccinated at 14 days after their 2nd dose. We conducted analysis for all vaccines, and separately by vaccine type. ...
... We tested the proportional hazards (PH) assumption using Schoenfeld residuals, which test whether there is a relationship between residuals and time, using the cox.zph function from the survival package in R (16). All models met this assumption. ...
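The workflow these two excerpts describe can be sketched with the survival package's built-in Stanford heart-transplant data, which already carry a time-dependent covariate in (start, stop] counting-process form; the dataset and covariates here are illustrative, not the study's actual data:

```r
library(survival)

# 'heart' is already in counting-process form: one row per interval,
# with 'transplant' switching from 0 to 1 when a subject is transplanted.
fit <- coxph(Surv(start, stop, event) ~ transplant + age,
             data = heart, ties = "breslow")

# Test the proportional hazards assumption via scaled Schoenfeld residuals
zph <- cox.zph(fit)
print(zph)
```

A non-significant cox.zph p value for a term (and for GLOBAL) is what "met this assumption" refers to above.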
Article
Full-text available
Background In February 2021 Kazakhstan began offering COVID-19 vaccines to adults. Breakthrough SARS-CoV-2 infections raised concerns about real-world vaccine effectiveness. We aimed to evaluate effectiveness of four vaccines against SARS-CoV-2 infection. Methods We conducted a retrospective cohort analysis among adults in Almaty using aggregated vaccination data and individual-level breakthrough COVID-19 cases (≥14 days from 2nd dose) using national surveillance data. We ran a time-adjusted Cox proportional hazards model with sensitivity analysis accounting for varying entry into the vaccinated cohort to assess vaccine effectiveness for each vaccine (measured as 1-adjusted hazard ratios) using the unvaccinated population as reference (N = 565,390). We separately calculated daily cumulative hazards for COVID-19 breakthrough among vaccinated persons by age and vaccination month. Results From February 22 to September 1, 2021, in Almaty, 747,558 (57%) adults were fully vaccinated (received 2 doses), and 108,324 COVID-19 cases (11,472 breakthrough) were registered. Vaccine effectiveness against infection was 79% [sensitivity estimates (SE): 74%–82%] for QazVac, 77% (SE: 71%–81%) for Sputnik V, 71% (SE: 69%–72%) for Hayat-Vax, and 70% (SE: 65%–72%) for CoronaVac. Among vaccinated persons, the 90-day follow-up cumulative hazard for breakthrough infection was 2.2%. Cumulative hazard was 2.9% among people aged ≥60 years versus 1.9% among persons aged 18–39 years (p < 0.001), and 1.2% for people vaccinated in February–May versus 3.3% in June–August (p < 0.001). Conclusion Our analysis demonstrates high effectiveness of COVID-19 vaccines against infection in Almaty, similar to other observational studies. Higher cumulative hazard of breakthrough among people ≥60 years of age and during variant surges warrants targeted booster vaccination campaigns.
... To quantify the extent to which injuries are associated with an individual's survival, we used time-dependent mixed-effects Cox models 53,54 . Animals that were injured were nearly three times more likely to die in the two months following an injury than animals that were not injured, independent of the reproductive season when the injury occurred (Table S1). ...
... To establish the effect of injuries on survival we used time-dependent Cox proportional hazard (PH) models. 53 For the analyses, we used the whole dataset (n = 1061), including injured and uninjured animals from both sexes. Animals that were removed from the population or that were still alive at the end of the study period were censored ( Figure S1 for details on sample size). ...
Article
Full-text available
Sociality has been linked to a longer lifespan in many mammals, including humans. Yet how sociality results in survival benefits remains unclear. Using 10 years of data and over 1000 recorded injuries in rhesus macaques (Macaca mulatta), we tested two injury-related mechanisms by which social status and affiliative partners might influence survival. Injuries increased individual risk of death by 3-fold in this dataset. We found that sociality can affect an individual’s survival by reducing their risk of injury but had no effect on the probability of injured individuals dying. Both males and females of high social status (measured as female matrilineal rank, male group tenure) and females with more affiliative partners (estimated using number of female relatives) experienced fewer injuries and thus were less likely to die. Collectively, our results offer rare insights into one mechanism that can mediate the well-known benefits of sociality on an individual’s fitness.
... [20] Patients were considered off anticoagulant treatment if they received prophylactic or intermediate doses of LMWH. Univariable Cox proportional hazards models were used to estimate hazard ratios (HRs) and 95% confidence intervals (95% CIs) for the associations between study outcomes and the following risk factors chosen based on previous studies: anticoagulant treatment status (on-treatment vs. off-treatment), age, sex, previous bleeding, cancer disease (active cancer and history of cancer, with the latter defined as solid or hematological cancer within the previous 5 years and not receiving active therapy), type of index gastrointestinal bleeding (major vs. clinically relevant non-major bleeding), endoscopy (upper endoscopy, colonoscopy, or both), supportive treatment (including one or more of the following: red cell or platelet blood transfusion, fresh frozen plasma, vitamin K, prothrombin complex concentrate, or a DOAC antidote), indication for anticoagulation (atrial fibrillation, prosthetic heart valves, VTE), and glomerular filtration rate (GFR, calculated with the Chronic Kidney Disease Epidemiology Collaboration [CKD-EPI] formula assuming that all patients were non-Black) at admission.[20][21][22] All these variables were subsequently included in multivariable Cox proportional hazards models.[20][21][22] Proportional hazards assumptions were checked by evaluating Schoenfeld residuals. ...
Article
Full-text available
Background Gastrointestinal bleeding frequently complicates anticoagulant therapy causing treatment discontinuation. Data to guide the decision regarding whether and when to resume anticoagulation based on the risks of thromboembolism and recurrent bleeding are scarce. Therefore, we aimed to retrospectively evaluate the incidence of these events after anticoagulant-related gastrointestinal bleeding and assess their relationship with timing of anticoagulation resumption. Methods Patients hospitalized because of gastrointestinal bleeding during oral anticoagulation for any indication were eligible. All patients were followed up to 2 years after the index bleeding for recurrent major or clinically relevant non-major bleeding, venous or arterial thromboembolism, and mortality. Results We included 948 patients hospitalized for gastrointestinal bleeding occurring during treatment with vitamin K antagonists (n=531) or direct oral anticoagulants (n=417). In time-dependent analysis, anticoagulant treatment was associated with a higher risk of recurrent clinically relevant bleeding (hazard ratio [HR] 1.55; 95% confidence interval [CI] 1.08 to 2.22), but lower risk of thromboembolism (HR 0.34; 95% CI 0.21 to 0.55), and death (HR 0.50; 95% CI 0.36 to 0.68). Previous bleeding, index major bleeding, and lower glomerular filtration rate were associated with a higher risk of recurrent bleeding. The incidence of recurrent bleeding increased after anticoagulation restart independently of timing of resumption. Conclusions Anticoagulant treatment after gastrointestinal bleeding is associated with a lower risk of thromboembolism and death, but higher risk of recurrent bleeding. The latter seemed to be influenced by patient characteristics and less impacted by time of anticoagulation resumption.
... An updated lifetime prediction, conditional on a patient's current state, can be made per year, using age-specific coefficients. We use these updated predictions as covariates in a time-dependent extended model 31,32 to evaluate the performance of our model on predicting time-to-event ("Methods"). Though the scores arise from a non-Cox model, since the Cox model score statistic is well defined for time-dependent covariates, the concordance is also well defined for a time-dependent risk score: at each event time the current risk score of the subject who failed is compared to the current (time-dependent) scores of all those still at risk. ...
Article
Full-text available
Coronary artery disease (CAD) is the leading cause of death among adults worldwide. Accurate risk stratification can support optimal lifetime prevention. Current methods lack the ability to incorporate new information throughout the life course or to combine innate genetic risk factors with acquired lifetime risk. We designed a general multistate model (MSGene) to estimate age-specific transitions across 10 cardiometabolic states, dependent on clinical covariates and a CAD polygenic risk score. This model is designed to handle longitudinal data over the lifetime to address this unmet need and support clinical decision-making. We analyze longitudinal data from 480,638 UK Biobank participants and compared predicted lifetime risk with the 30-year Framingham risk score. MSGene improves discrimination (C-index 0.71 vs 0.66), age of high-risk detection (C-index 0.73 vs 0.52), and overall prediction (RMSE 1.1% vs 10.9%), in held-out data. We also use MSGene to refine estimates of lifetime absolute risk reduction from statin initiation. Our findings underscore our multistate model’s potential public health value for accurate lifetime CAD risk estimation using clinical factors and increasingly available genetics toward earlier more effective prevention.
... As the effect on the endpoints changed over time, which violates the basic assumption of proportional hazards, a step function over three time intervals ([index, 1 year), [1 year, 3 years), [3 years, ∞)) was used, thus avoiding the problem of non-proportional hazards [11,12]. This method accounted for the changing effect over time of the parameters potentially influencing the outcome for each valve type. ...
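The step-function approach described above can be sketched with survSplit from the survival package, following Therneau's vignette; the veteran data and 90/180-day cutpoints stand in for the study's yearly intervals:

```r
library(survival)

# Split each subject's follow-up at the chosen cutpoints, so each row
# falls entirely within one time interval (indexed by 'tgroup').
vet2 <- survSplit(Surv(time, status) ~ ., data = veteran,
                  cut = c(90, 180), episode = "tgroup", id = "id")

# karno:strata(tgroup) estimates one Karnofsky-score coefficient per
# interval: a step function in time for that covariate's effect.
fit <- coxph(Surv(tstart, time, status) ~ trt + karno:strata(tgroup),
             data = vet2)
```

Comparing the three karno coefficients shows directly how the effect attenuates (or grows) across the intervals, which is exactly the non-proportionality the step function absorbs.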
Article
Full-text available
Background: Conflicting data exist on the occurrence and outcome of infective endocarditis (IE) after pulmonary valve implantation. Objectives: This study sought to assess the differences between transcatheter pulmonary valve implantation (TPVI) and surgical pulmonary valve replacement (SPVR). Methods: All patients ≥ 4 years who underwent isolated pulmonary valve replacement between 2005 and 2018 were analyzed based on the data of a major German health insurer (≈9.2 million insured subjects representative of the German population). The primary endpoint was a composite of IE occurrence and all-cause death. Results: Of 461 interventions (cases) in 413 patients (58.4% male, median age 18.9 years [IQR 12.3–33.4]), 34.4% underwent TPVI and 65.5% SPVR. IE was diagnosed in 8.0% of cases during a median follow-up of 3.5 years. Risk for IE and all-cause death was increased in patients with prior IE (p < 0.001), but not associated with age (p = 0.50), sex (p = 0.67) or complexity of disease (p = 0.59). While there was no difference in events over the entire observational time period (p = 0.22), the time dynamics varied between TPVI and SPVR: Within the first year, the risk for IE and all-cause death was lower after TPVI (Hazard Ratio (HR) 95% CI 0.19 (0.06–0.63; p = 0.006) but increased over time and exceeded that of SPVR in the long term (HR 10.07 (95% CI 3.41–29.76; p < 0.001). Conclusions: Patients with TPVI appear to be at lower risk for early but higher risk for late IE, resulting in no significant difference in the overall event rate compared to SPVR. The results highlight the importance of long-term specialized care and preventive measures after both interventions.
... Once covariates or coefficients in the Cox PH model depend on time, it is no longer a 'proportional hazards' model, since the relative hazard is also time-varying. Therefore, when the PH assumption is violated, the simple Cox regression model is not valid, and more sophisticated techniques have been utilized to deal with the non-proportionality of hazards (73)(74)(75). Mathematically, extending the simple PH model to a more general NPH model is easy. In particular, a general expression for the hazard function, given covariate X, is: ...
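The general expression the excerpt introduces did not survive extraction; the standard extended form, allowing both a time-varying covariate and a time-varying coefficient, can be written as:

```latex
\lambda\bigl(t \mid X(t)\bigr) = \lambda_0(t)\,\exp\bigl\{\beta(t)^{\top} X(t)\bigr\}
```

where \(\lambda_0(t)\) is the baseline hazard; the model reduces to the ordinary proportional hazards model when \(\beta(t) \equiv \beta\) and \(X(t) \equiv X\).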
Article
Full-text available
The time-to-event relationship for survival modeling is considered when designing a study in clinical trials. However, because time-to-event data are mostly not normally distributed, survival analysis uses non-parametric data processing and analysis methods, mainly Kaplan–Meier (KM) estimation models and Cox proportional hazards (CPH) regression models. At the same time, the log-rank test can be applied to compare curves from different groups. However, resorting to conventional survival analysis when fundamental assumptions, such as the Cox PH assumption, are not met can seriously affect the results, rendering them flawed. Consequently, it is necessary to examine and report more sophisticated statistical methods related to the processing of survival data, but at the same time, able to adequately respond to the contemporary real problems of clinical applications. On the other hand, the frequent misinterpretation of survival analysis methodology, combined with the fact that it is a complex statistical tool for clinicians, necessitates a better understanding of the basic principles underlying this analysis to effectively interpret medical studies in making treatment decisions. In this review, we first consider the basic models and mechanisms behind survival analysis. Then, due to common errors arising from the inappropriate application of conventional models, we revise more demanding statistical extensions of survival models related to data manipulation to avoid wrong results. By providing a structured review of the most representative statistical methods and tests covering contemporary survival analysis, we hope this review will assist in solving problems that arise in clinical applications.
... To identify the association between grandmother presence and grandchild mortality, we performed survival analyses using the counting process formulation of the Cox proportional hazards (CPH) model [41][42][43], which we fitted in R version 3.6.1 [44] with the function coxme in the R package coxme, following Therneau et al. [45,46]. All models included maternal grandmother presence, which we categorized as a two-level variable (0 = absent, 1 = present) based on whether the grandmother had been confirmed alive or not. ...
Article
Full-text available
Grandmother presence can improve the number and survival of their grandchildren, but what grandmothers protect against and how they achieve it remains poorly known. Before modern medical care, infections were leading causes of childhood mortality, alleviated from the nineteenth century onwards by vaccinations, among other things. Here, we combine two individual-based datasets on the genealogy, cause-specific mortality and vaccination status of eighteenth- and nineteenth-century Finns to investigate two questions. First, we tested whether there were cause-specific benefits of grandmother presence on grandchild survival from highly lethal infections (smallpox, measles, pulmonary and diarrhoeal infections) and/or accidents. We show that grandmothers decreased all-cause mortality, an effect which was mediated through smallpox, pulmonary and diarrhoeal infections, but not via measles or accidents. Second, since grandmothers have been suggested to increase vaccination coverage, we tested whether the grandmother effect on smallpox survival was mediated through increased or earlier vaccination, but we found no evidence for such effects. Our findings that the beneficial effects of grandmothers are in part driven by increased survival from some (but not all) childhood infections, and are not mediated via vaccination, have implications for public health, societal development and human life-history evolution.
... The Kaplan-Meier survival curves were plotted to show differences in survival time, and log-rank p values reported by the Cox regression models implemented in the R package survival v3.2.11 were used to determine statistical significance. 69 ...
Article
Full-text available
EGFR-TKIs were used in NSCLC patients with actionable EGFR mutations and prolong prognosis. However, most patients treated with EGFR-TKIs developed resistance within around one year. This suggests that residual EGFR-TKIs resistant cells may eventually lead to relapse. Predicting resistance risk in patients will facilitate individualized management. Herein, we built an EGFR-TKIs resistance prediction (R-index) model and validate in cell line, mice, and cohort. We found significantly higher R-index value in resistant cell lines, mice models and relapsed patients. Patients with an elevated R-index had significantly shorter relapse time. We also found that the glycolysis pathway and the KRAS upregulation pathway were related to EGFR-TKIs resistance. MDSC is a significant immunosuppression factor in the resistant microenvironment. Our model provides an executable method for assessing patient resistance status based on transcriptional reprogramming and may contribute to the clinical translation of patient individual management and the study of unclear resistance mechanisms.
... Therefore, additional analyses were performed using Cox proportional hazards (Cox model) regression (R package "survival" v.2.44-1.1, [69,70]) with a time-dependent covariate (TDC) to determine whether tag implantation was associated with an increased risk of mortality, and to what extent differences changed over time. The time-dependent covariate was a binomial categorization that was set to one for tagged and zero for control animals surviving to Day 32, which gave the model an explicit value to measure the significance and magnitude of a tag effect from days 32 to 61. ...
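This kind of setup, where a 0/1 covariate switches on at a fixed day, can be sketched with tmerge from the survival package; the ids, times, and variable names below are invented for illustration, not taken from the study:

```r
library(survival)

# Hypothetical one-row-per-animal data: follow-up time and death indicator
base <- data.frame(id     = 1:6,
                   futime = c(20, 45, 61, 61, 55, 61),
                   death  = c(1,  1,  0,  0,  1,  1))

# Establish the baseline counting-process rows (tstart = 0, tstop = futime)
td <- tmerge(base, base, id = id, death = event(futime, death))

# Switch a 0/1 'tagged' covariate on at day 32 for animals 1-3;
# tdc() with no value argument is 0 before the given time and 1 after.
td <- tmerge(td, data.frame(id = 1:3, tagday = 32), id = id,
             tagged = tdc(tagday))

fit <- coxph(Surv(tstart, tstop, death) ~ tagged, data = td)
```

Rows for animals still alive at day 32 are split there, so the hazard ratio for tagged measures the tag effect only over the person-time after the switch, as in the excerpt's days 32 to 61.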
Article
Full-text available
Background Little is known about the transformer stage of the parasitic lampreys, a brief but critical period that encompasses juvenile out-migration from rivers to lakes or oceans to begin parasitic feeding. Information about this life stage could have significant conservation implications for both imperiled and invasive lampreys. We investigated tag retention, survival, wound healing, and swim performance of newly transformed sea lamprey (Petromyzon marinus) implanted with a new micro-acoustic transmitter, the eel–lamprey acoustic transmitter (ELAT), in a controlled laboratory environment. Results The 61-day survival of our tagged subjects was 71%, within the range reported in similar studies of juvenile lampreys. However, survival was significantly lower in the tagged animals (vs control), with no effect statistically attributable to measures of animal length, mass, condition, or population of origin (Great Lakes vs. Atlantic drainage). Mortality in tagged fish was concentrated in the first four days post-surgery, suggesting injury from the surgical process. An unusually long recovery time from anesthesia may have contributed to the increased mortality. In a simple burst swim assay, tagged animals swam significantly slower (− 22.5%) than untagged animals, but were not significantly different in endurance swim tests. A composite wound healing score at day four was a significant predictor of maximum burst swim speed at day 20, and wound condition was related to animal mass, but not length, at the time of tagging. Conclusions Impairments to survival and swim performance of juvenile sea lamprey implanted with the ELAT transmitter were within currently reported ranges for telemetry studies with small, difficult to observe fishes. Our results could be improved with more refined anesthesia and surgical techniques. 
The ability to track migratory movements of imperiled and pest populations of parasitic lampreys will improve our ability to estimate vital rates that underlie recruitment to the adult population (growth, survival) and to investigate the environmental factors that regulate the timing and rates of movement, in wild populations.
... Cox proportional hazards regression with time-varying covariates was used to estimate the association between intergenerational relationships and cognitive impairment [35]. Follow-up time was used as the timescale, which was calculated as the time from recruitment to the first occurrence of cognitive impairment or death or the last follow-up survey. ...
Article
Full-text available
Background The oldest-old (aged 80 or older) are the most rapidly growing age group, and they are more likely to suffer from cognitive impairment, leading to severe medical and economic burdens. The influence of intergenerational relationships on cognition among Chinese oldest-old adults is not clear. We aim to examine the association of intergenerational relationships with cognitive impairment among Chinese adults aged 80 or older. Methods This was a prospective cohort study, and data were obtained from the Chinese Longitudinal Healthy Longevity Survey, 14,180 participants aged 80 or older with at least one follow-up survey from 1998 to 2018. Cognitive impairment was assessed by the Chinese version of Mini Mental State Examination, and intergenerational relationships were assessed by getting main financial support from children, living with children or often being visited by children, and doing housework or childcare. We used time-varying Cox proportional hazards models to estimate hazard ratios (HRs) and 95% confidence intervals (CIs) of associations between intergenerational relationships and cognitive impairment. Results We identified 5443 incident cognitive impairments in the 24-cut-off MMSE cohort and 4778 in the 18-cut-off MMSE cohort between 1998 and 2018. After adjusting for a wide range of confounders, the HR was 2.50 (95% CI: 2.31, 2.72) in the old who received main financial support from children, compared with those who did not. The HR was 0.89 (95% CI: 0.83, 0.95) in the oldest-old who did housework or childcare, compared with those who did not. However, there were no significant associations between older adults’ cognitive impairments and whether they were living with or often visited by their children. Our findings were consistent in two different MMSE cut-off values (24 vs. 18) for cognitive impairment. 
Conclusions Sharing housework or childcare for children showed a protective effect on older adults’ cognitive function, whereas having children provide primary financial support could increase the risk of cognitive impairment. Our findings suggest that governments and children should pay more attention to older adults whose main financial support comes from their children. Children can arrange some easy tasks for adults 80 years of age or older to help prevent cognitive impairment.
... Survival analysis was performed using Cox proportional hazards regression as implemented in the R package survival (78,66). Likelihood ratio tests were used to determine the statistical significance of the survival models. Wilcoxon rank sum tests were used to evaluate differences in immune cell frequencies and mean fluorescence intensity by flow cytometry, to calculate differentially expressed genes from the scRNAseq analysis, and to identify gene modules that were statistically different between patients who survived and those who died. ...
Article
Full-text available
Despite extensive analyses, there remains an urgent need to delineate immune cell states that contribute to mortality in critically ill Coronavirus disease 2019 (COVID-19) patients. Here, we present high-dimensional profiling of blood and respiratory samples in severe COVID-19 patients to examine the association between cell-linked molecular features and mortality outcomes. Peripheral transcriptional profiles by single-cell RNAseq based deconvolution of immune states are associated with COVID-19 mortality. Further, persistently high levels of an interferon signaling module in monocytes over time leads to subsequent concerted upregulation of inflammatory cytokines. SARS-CoV-2 infected myeloid cells in the lower respiratory tract upregulate CXCL10, leading to a higher risk of death. Our analysis suggests a pivotal role for viral infected myeloid cells and protracted interferon signaling in severe COVID-19.
... Survival analysis was performed using Cox proportional hazards regression as implemented in the R package survival 60,61 . Likelihood ratio tests were used to determine the statistical significance of the survival models. ...
Preprint
Full-text available
Coronavirus disease 2019 (COVID-19) caused by SARS-CoV-2 infection presents with varied clinical manifestations, ranging from mild symptoms to acute respiratory distress syndrome (ARDS) with high mortality. Despite extensive analyses, there remains an urgent need to delineate immune cell states that contribute to mortality in severe COVID-19. We performed high-dimensional cellular and molecular profiling of blood and respiratory samples from critically ill COVID-19 patients to define immune cell genomic states that are predictive of outcome in severe COVID-19 disease. Critically ill patients admitted to the intensive care unit (ICU) manifested increased frequencies of inflammatory monocytes and plasmablasts that were also associated with ARDS not due to COVID-19. Single-cell RNAseq (scRNAseq)-based deconvolution of genomic states of peripheral immune cells revealed distinct gene modules that were associated with COVID-19 outcome. Notably, monocytes exhibited bifurcated genomic states, with expression of a cytokine gene module exemplified by CCL4 (MIP-1β) associated with survival and an interferon signaling module associated with death. These gene modules were correlated with higher levels of MIP-1β and CXCL10 levels in plasma, respectively. Monocytes expressing genes reflective of these divergent modules were also detectable in endotracheal aspirates. Machine learning algorithms identified the distinctive monocyte modules as part of a multivariate peripheral immune system state that was predictive of COVID-19 mortality. Follow-up analysis of the monocyte modules on ICU day 5 was consistent with bifurcated states that correlated with distinct inflammatory cytokines. Our data suggests a pivotal role for monocytes and their specific inflammatory genomic states in contributing to mortality in life-threatening COVID-19 disease and may facilitate discovery of new diagnostics and therapeutics.
Article
Full-text available
Objectives. The goals are to develop a nonlinear risk model and examine its prediction applicability for clinical use. Methods. Methods of survival analysis and regression statistical models were used. Results. A practical approach to assessing nonlinear risks of adverse events using the example of gastric cancer treatment is proposed. A model for predicting the metachronous peritoneal dissemination in patients undergoing radical surgery for gastric cancer was proposed and studied. Assessment of risks for various periods of observation was performed, and the clinical suitability of developed approach was assessed. Conclusion. In clinical oncological practice, not only timely treatment plays an important role, but also the prevention of adverse outcomes after treatment. Individualization of patient monitoring after treatment reduces the risks of fatal outcomes and the costs of additional research and treatment in the event of cancer progression. Based on the results of this study, we propose solutions that should lead to more effective and high-quality treatment tactics and follow-up after treatment for gastric cancer, also to the selection of optimal approaches and to obtaining clinically favorable outcomes of the disease. The proposed risk prediction method will ultimately lead to individualized patient management based on the results of personal data.
Article
Full-text available
In this study, we intensively measured the longitudinal productivity and survival of 362 commercially managed honey bee colonies in Canada, over a two-year period. A full factorial experimental design was used, whereby two treatments were repeated across apiaries situated in three distinct geographic regions: Northern Alberta, Southern Alberta and Prince Edward Island, each having unique bee management strategies. In the protein supplemented treatment, colonies were continuously provided a commercial protein supplement containing 25% w/w pollen, in addition to any feed normally provided by beekeepers in that region. In the fumagillin treatment, colonies were treated with the label dose of Fumagilin-B ® each year during the fall. Neither treatment provided consistent benefits across all sites and dates. Fumagillin was associated with a large increase in honey production only at the Northern Alberta site, while protein supplementation produced an early season increase in brood production only at the Southern Alberta site. The protein supplement provided no long-lasting benefit at any site and was also associated with an increased risk of death and decreased colony size later in the study. Differences in colony survival and productivity among regions, and among colonies within beekeeping operations, were far larger than the effects of either treatment, suggesting that returns from extra feed supplements and fumagillin were highly contextually dependent. We conclude that use of fumagillin is safe and sometimes beneficial, but that beekeepers should only consider excess protein supplementation when natural forage is limiting.
Article
Full-text available
In Westminster parliamentary democracies like Australia, Canada, and New Zealand, research has found that cabinet composition is driven mainly by longstanding norms and practices that privilege older, white males with certain educational and political experiences. Do these trends apply at the subnational level where the demographic make-up can be quite different? To answer this question, we draw upon an original dataset of all members of the legislative assembly and cabinet in three Canadian territories from 1979 to 2022. These territories are unique given that Indigenous communities loom large in their governments and societies. Using an event history model, we find that territorial cabinets very much reflect the demographic make up of their legislatures, similar to what occurs at the federal level. We also observe important differences between the territories, which suggest that the influence of Westminster structures and norms are likely mediated by factors unique to each territory.
Article
Full-text available
This study attempts to identify and briefly describe the current directions in applied and theoretical clinical prediction research. Context-rich chronic heart failure syndrome (CHFS) telemedicine provides the medical foundation for this effort. In the chronic stage of heart failure, there are sudden exacerbations of syndromes with subsequent hospitalizations, which are called acute decompensations of heart failure (ADHF). These decompensations are the subject of diagnostic and prognostic predictions. The primary purpose of ADHF predictions is to clarify the current and future health status of patients and subsequently optimize therapeutic responses. We proposed a simplified discrete-state disease model as an attempt at a typical summarization of a medical subject before starting predictive modeling. The study also tries to structure the essential common characteristics of quantitative models in order to understand the issue in an application context. The last part provides an overview of prediction works in the field of CHFS. These three parts provide the reader with a comprehensive view of quantitative clinical predictive modeling in heart failure telemedicine, with an emphasis on several key general aspects. The target community is medical researchers seeking to align their clinical studies with prognostic or diagnostic predictive modeling, as well as other predictive researchers. The study was written by a non-medical expert.
Article
Background Although growing evidence has shown independent links of long-term exposure to fine particulate matter (PM2.5) with cognitive impairment, the effects of its constituents remain unclear. This study aims to explore the associations of long-term exposure to ambient PM2.5 constituents’ mixture with cognitive impairment in Chinese older adults, and to further identify the main contributor. Methods 15,274 adults ≥ 65 years old were recruited by the Chinese Longitudinal Healthy Longevity Study (CLHLS) and followed up through 7 waves during 2000–2018. Concentrations of ambient PM2.5 and its constituents (i.e., black carbon [BC], organic matter [OM], ammonium [NH4+], sulfate [SO42-], and nitrate [NO3-]) were estimated by satellite retrievals and machine learning models. Quantile-based g-computation model was employed to assess the joint effects of a mixture of 5 PM2.5 constituents and their relative contributions to cognitive impairment. Analyses stratified by age group, sex, residence (urban vs. rural), and region (north vs. south) were performed to identify vulnerable populations. Results During the average 3.03 follow-up visits (89,296.9 person-years), 4294 (28.1%) participants had developed cognitive impairment. The adjusted hazard ratio [HR] (95% confidence interval [CI]) for cognitive impairment for every quartile increase in mixture exposure to 5 PM2.5 constituents was 1.08 (1.05–1.11). BC held the largest index weight (0.69) in the positive direction in the qg-computation model, followed by OM (0.31). Subgroup analyses suggested stronger associations in younger old adults and rural residents. Conclusion Long-term exposure to ambient PM2.5, particularly its constituents BC and OM, is associated with an elevated risk of cognitive impairment onset among Chinese older adults.
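The quantile-based g-computation approach described above first recodes each constituent's concentration into quantile scores before fitting a joint mixture model. A minimal Python sketch of that quartile-scoring step only (the regression and weight estimation of qg-computation are not shown; the concentration values and names are hypothetical):

```python
from statistics import quantiles

def quartile_scores(values):
    """Recode raw exposure values into quartile scores 0-3, the exposure
    coding step used by quantile-based g-computation (simplified)."""
    q1, q2, q3 = quantiles(values, n=4)          # three interior cut points
    return [sum(v > c for c in (q1, q2, q3)) for v in values]

# Hypothetical black-carbon concentrations for 8 observations
bc = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
scores = quartile_scores(bc)                     # 0 = lowest quartile, 3 = highest
```

With every constituent on this common 0-3 scale, "every quartile increase in mixture exposure" corresponds to a one-unit increase in all scores simultaneously.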
Preprint
Full-text available
The current demand for early intervention, prevention, and treatment of late onset Alzheimer's disease (LOAD) warrants deeper understanding of the underlying molecular processes which could contribute to biomarker and drug target discovery. Utilizing high-throughput proteomic measurements in serum from a prospective population-based cohort of older adults (n = 5,294), we identified 303 unique proteins associated with incident LOAD (median follow-up 12.8 years). Over 40% of these proteins were associated with LOAD independently of APOE-ε4 carrier status. These proteins were implicated in neuronal processes and overlapped with protein signatures of LOAD in brain and cerebrospinal fluid. We found 17 proteins whose LOAD association was strongly dependent on APOE-ε4 carrier status. Most of them showed consistent associations with LOAD in cerebrospinal fluid and a third had brain-specific gene expression. Remarkably, four proteins in this group (TBCA, ARL2, S100A13 and IRF6) were downregulated by APOE-ε4 yet upregulated as a consequence of LOAD, as determined in a bi-directional Mendelian randomization analysis, reflecting a potential response to the disease onset. Accordingly, the direct association of these proteins to LOAD was reversed upon APOE-ε4 genotype adjustment, a finding which we replicate in an external cohort (n = 719). Our findings provide an insight into the dysregulated pathways that may lead to the development and early detection of LOAD, including those both independent and dependent on APOE-ε4. Importantly, many of the LOAD-associated proteins we find in the circulation have been found to be expressed in brain tissue and to have a direct link with AD. Thus, the proteins identified here, and their upstream modulating pathways, provide a new source of circulating biomarker and therapeutic target candidates for LOAD.
Article
Background This study evaluates the relationship between smoking, alcohol, and breast cancer outcomes according to molecular subtype. Methods This population-based prospective cohort consisted of 3,876 women ages 20 to 69 diagnosed with a first primary invasive breast cancer from 2004 to 2015 in the Seattle–Puget Sound region. Breast cancer was categorized into three subtypes based on estrogen receptor (ER), progesterone receptor (PR), and HER2 expressions: luminal (ER+), triple-negative (TN; ER−/PR−/HER2−), and HER2-overexpressing (H2E; ER−/HER2+). We fit Cox proportional hazards models to assess the association between alcohol consumption and smoking status at diagnosis and risks of recurrence, breast cancer–specific mortality, and all-cause mortality. Results Histories of ever smoking [HR, 1.33; 95% confidence interval (CI), 1.01–1.74] and current smoking (HR, 1.59; 95% CI, 1.07–2.35) were associated with greater risk of breast cancer recurrence among TN cases. Smoking was also associated with greater risk of recurrence to bone among all cases and among luminal cases. Elevated risks of breast cancer–specific and all-cause mortality were observed among current smokers across all subtypes. Alcohol use was not positively associated with risk of recurrence or mortality overall; however, TN patients who drank four or more drinks per week had a decreased risk of recurrence (HR, 0.71; 95% CI, 0.51–0.98) and breast cancer–specific mortality (HR, 0.73; 95% CI, 0.55–0.97) compared with non-current drinkers. Conclusions Patients with breast cancer with a history of smoking at diagnosis have elevated risks of recurrence and mortality. Impact These findings underscore the need to prioritize smoking cessation among women diagnosed with breast cancer.
Article
Full-text available
Objectives Oral contraceptives (OC) and menopausal hormone therapy (MHT) contain exogenous sex hormones and are used by millions of women around the world. However, their effect on the development of rheumatoid arthritis (RA) is still debated and the current literature suggests that they may exert opposite effects on the risk of RA. The present study aimed to estimate the effects of exogenous hormones on the development of RA, both during the reproductive lifespan and later in life. Methods The association between OC and RA, as well as between MHT and late-onset RA (LORA), was investigated using time-dependent Cox regression modelling in white British women from the UK Biobank (n = 236 602 and n = 102 466, respectively) and replicated in women from all ethnic groups. Results OC use was associated with a decreased risk of RA in ever-users [hazard ratio (HR) = 0.89; 95% CI = 0.82–0.96], as well as in current (HR = 0.81; 0.73–0.91) and former users (HR = 0.92; 0.84–1.00), compared with never-users. In contrast, MHT use was associated with an increased risk of LORA in ever-users (HR = 1.16; 1.06–1.26) as well as in former users (HR = 1.13; 1.03–1.24) compared with never-users. Conclusion OC use appears to protect against RA, while MHT may increase the risk of LORA. This study provides new insights into the possible inverse effect of exposure to different exogenous sex hormones on the risk of RA.
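Time-dependent Cox models like the one used in this study encode a time-varying exposure (e.g., never/current/former use) by splitting each subject's follow-up into (start, stop] intervals. A minimal sketch of that data-preparation step, assuming exposure simply toggles on and off at known switch times (the function and variable names are hypothetical, not from the paper):

```python
def counting_process_rows(follow_up_end, event, switch_times):
    """Split one subject's follow-up into (start, stop, status, exposed)
    rows at each exposure switch time, in Andersen-Gill counting-process
    form. The subject starts unexposed; exposure toggles at each switch."""
    cuts = [0.0] + sorted(t for t in switch_times if 0 < t < follow_up_end) + [follow_up_end]
    rows, exposed = [], 0
    for start, stop in zip(cuts, cuts[1:]):
        status = event if stop == follow_up_end else 0   # event only in final row
        rows.append((start, stop, status, exposed))
        exposed = 1 - exposed                            # toggle exposure at the cut
    return rows

# Followed 10 years, event at end, exposed between years 2 and 6
rows = counting_process_rows(10.0, 1, [2.0, 6.0])
```

Each row then enters the partial likelihood as an interval at risk, which is exactly the `Surv(start, stop, status)` representation used by time-dependent Cox software.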
Article
Immune-targeted therapies have efficacy for treatment of autoinflammatory diseases. For example, treatment with the T cell–specific anti-CD3 antibody teplizumab delayed disease onset in participants at high risk for type 1 diabetes (T1D) in the TrialNet 10 (TN-10) trial. However, heterogeneity in therapeutic responses in TN-10 and other immunotherapy trials identifies gaps in understanding disease progression and treatment responses. The intestinal microbiome is a potential source of biomarkers associated with future T1D diagnosis and responses to immunotherapy. We previously reported that antibody responses to gut commensal bacteria were associated with T1D diagnosis, suggesting that certain antimicrobial immune responses may help predict disease onset. Here, we investigated anticommensal antibody (ACAb) responses against a panel of taxonomically diverse intestinal bacteria species in sera from TN-10 participants before and after teplizumab or placebo treatment. We identified IgG2 responses to three species that were associated with time to T1D diagnosis and with teplizumab treatment responses that delayed disease onset. These antibody responses link human intestinal bacteria with T1D progression, adding predictive value to known T1D risk factors. ACAb analysis provides a new approach to elucidate heterogeneity in responses to immunotherapy and identify individuals who may benefit from teplizumab, recently approved by the U.S. Food and Drug Administration for delaying T1D onset.
Article
Full-text available
This paper addresses the problem of learning temporal graph representations, which capture the changing nature of complex evolving networks. Existing approaches mainly focus on adding new nodes and edges to capture dynamic graph structures. However, to achieve more accurate representation of graph evolution, we consider both the addition and deletion of nodes and edges as events. These events occur at irregular time scales and are modeled using temporal point processes. Our goal is to learn the conditional intensity function of the temporal point process to investigate the influence of deletion events on node representation learning for link-level prediction. We incorporate network entropy, a measure of node and edge significance, to capture the effect of node deletion and edge removal in our framework. Additionally, we leveraged the characteristics of a generalized temporal Hawkes process, which considers the inhibitory effects of events where past occurrences can reduce future intensity. This framework enables dynamic representation learning by effectively modeling both addition and deletion events in the temporal graph. To evaluate our approach, we utilize autonomous system graphs, a family of inhomogeneous sparse graphs with instances of node and edge additions and deletions, in a link prediction task. By integrating these enhancements into our framework, we improve the accuracy of dynamic link prediction and enable better understanding of the dynamic evolution of complex networks.
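A generalized temporal Hawkes process of the kind described allows negative jump sizes, so past (e.g., deletion) events can suppress future intensity. A minimal sketch of such a conditional intensity with exponential decay, clipped at zero to remain a valid intensity (parameter values are illustrative, not taken from the paper):

```python
import math

def hawkes_intensity(t, mu, events, beta):
    """Conditional intensity of a generalized Hawkes process at time t.
    `events` is a list of (t_i, alpha_i) pairs; alpha_i < 0 models the
    inhibitory effect of past events (e.g., deletions), reducing future
    intensity. The result is clipped at 0."""
    excitation = sum(a * math.exp(-beta * (t - ti)) for ti, a in events if ti < t)
    return max(0.0, mu + excitation)
```

With only additions (positive alphas) this reduces to the classical self-exciting Hawkes process; mixing signs lets one event stream both excite and inhibit the predicted link dynamics.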
Article
Full-text available
Effectively addressing grand societal challenges like climate change and environmental degradation requires policy intervention that is not only continuous but also increasing in ambition over time. However, negative feedback could lead to policies being weakened or even discontinued after a while. An important but unresolved policy question, therefore, is whether policies can be deliberately designed to survive (i.e., to “stick”) and, ideally, be replaced with more ambitious ones over time (i.e., to “ratchet up”). We bridge policy feedback and policy design scholarship to derive hypotheses on the effects of two policy design features—“intensity” (i.e., a measure of policies’ overall design) and “specificity” (i.e., a measure of policies’ targeted focus)—on policy (dis-)continuation and ratcheting-up (-down) of ambition. Focusing on policy design, we contribute to the theorization and empirical understanding of endogenous factors behind policy change. We test our hypotheses with an event history dataset of 627 low-carbon energy policies in eight developed countries. Conducting a multilevel survival analysis, we find statistically significant evidence of more intense policies being replaced with less intense ones, i.e., more intense policies lead to ratcheting-down of ambition. We also find that more specific policies are more likely to be replaced with more intense policies, i.e., more specific policies lead to ratcheting-up of ambition. Based on these novel insights, we discuss how policy design can navigate these complex dynamics. In this sense, our approach also contributes to the discussion about the “forward-looking” potential of the policy sciences.
Article
Introduction: The health effects of ambient ozone have been investigated in many previous studies. However, the effects of long-term exposure to ambient ozone on the incidence of cardiovascular disease (CVD) remain inconclusive. Objectives: To estimate the associations of long-term exposure to maximum daily 8-hour average ozone (MDA8 O3) with the incidence of total CVD, heart disease, hypertension, and stroke. Methods: This was a prospective cohort study, and the data were obtained from the China Health and Retirement Longitudinal Survey (CHARLS) implemented during 2011-2018 and the China Family Panel Studies (CFPS) implemented during 2010-2018. We applied a Cox proportional hazards regression model to evaluate the associations of MDA8 O3 with total CVD, heart disease, hypertension, and stroke risks, and the corresponding population-attributable fractions (PAF) attributable to MDA8 O3 were also calculated. All analyses were conducted in R. Results: The mean MDA8 O3 concentrations of all included participants in the CHARLS and CFPS were 51.03 parts per billion (ppb) and 51.15 ppb, respectively. In the CHARLS, including 18,177 participants, each 10 ppb increment in MDA8 O3 concentration was associated with a 31% increase [hazard ratio (HR) = 1.31, 95% confidence interval (CI): 1.22-1.42] in the risk of incident heart disease, and the corresponding population-attributable fraction (PAF) was 13.79% [10.12%-17.32%]. In the CFPS, including 30,226 participants, each 10 ppb increment in MDA8 O3 concentration was associated with an increase in the risk of incident total CVD (1.07 [1.02-1.13]) and hypertension (1.10 [1.03-1.18]). The PAFs of total CVD and hypertension attributable to MDA8 O3 were 3.53% [0.82%-6.16%] and 5.11% [1.73%-8.38%], respectively. Stratified analyses showed greater associations in males, urban areas, and Southern China. Conclusions: Long-term exposure to MDA8 O3 may increase the incidence of CVD. Therefore, policies that control O3 and related precursors are persistently needed.
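Population-attributable fractions like those reported above combine a relative-risk estimate with the exposure distribution. One common form is Levin's formula; a minimal sketch with a hypothetical exposure prevalence (the paper's exact PAF computation for a continuous exposure may differ):

```python
def levin_paf(prevalence, rr):
    """Levin's population-attributable fraction: the share of incidence
    that would be removed if an exposure with relative risk `rr` and
    population prevalence `prevalence` were eliminated."""
    return prevalence * (rr - 1.0) / (prevalence * (rr - 1.0) + 1.0)

# Hypothetical: 40% of the population exposed, HR of 1.31 (as for heart disease)
paf = levin_paf(0.40, 1.31)
```

The HR here stands in for the relative risk, a usual approximation when the outcome is not too common over follow-up.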
Article
Full-text available
Background Survival in cancer patients is associated with a multitude of biological, social, and psychological factors. Although it is well established that all these factors add to overall mortality, it is not well understood how the predictive power of these parameters changes in a comprehensive model and over time. Methods Patients who attended the authors’ outpatient clinic were invited to participate. The authors followed 5180 mixed cancer patients (51.1% female; mean age, 59.1 years [SD = 13.8]) for up to 16 years and analyzed biological (age, sex, cancer site, anemia), psychological (anxiety, depression), and social variables (marital status, education, employment status) potentially predicting overall survival in a Cox proportional hazards model. Results The median survival time for the entire sample was 4.3 years (95% confidence interval, 4.0–4.7). The overall survival probabilities for 1 and 10 years were 76.8% and 38.0%, respectively. Following an empirical approach, the authors split the time interval into five periods: acute, subacute, short‐term, medium‐term, and long‐term. A complex pattern of variables predicted overall survival differently in the five periods. Biological parameters were important throughout most of the time, social parameters were either time‐independent predictors or tended to be more important in the longer term. Of the psychological parameters, only depression was a significant predictor and lost its predictive power in the long‐term. Conclusions The findings of this study allow the development of comprehensive patient‐specific models of risk and resilience factors addressing biopsychosocial needs of cancer patients, paving the way for a personalized treatment plan that goes beyond biomedical cancer care.
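Letting predictors act differently in the acute, subacute, short-, medium-, and long-term periods requires splitting each patient's follow-up at the period boundaries, as `survival::survSplit` does in R. A minimal Python sketch of that episode-splitting step (the cutpoints below are hypothetical, not the paper's):

```python
def split_episodes(time, event, cutpoints):
    """Split one subject's follow-up (0, time] at `cutpoints` into
    period-specific rows (start, stop, status, period), so that covariate
    effects can be estimated separately within each period."""
    cuts = [0.0] + [c for c in sorted(cutpoints) if c < time] + [time]
    rows = []
    for period, (start, stop) in enumerate(zip(cuts, cuts[1:])):
        status = event if stop == time else 0   # event can only occur in the last row
        rows.append((start, stop, status, period))
    return rows

# Hypothetical period boundaries in years; a patient who died at 4.3 years
rows = split_episodes(4.3, 1, [0.5, 1.0, 5.0, 10.0])
```

Interacting the period label with each covariate in a Cox model then yields period-specific hazard ratios, which is how predictive power can be seen to shift between biological, social, and psychological variables over time.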
Article
Full-text available
Increasing mixed chimerism (reemerging recipient cells) after allogeneic hematopoietic cell transplantation (allo-HCT) can indicate relapse, the leading factor determining mortality in blood malignancies. Most clinical chimerism tests have limited sensitivity and are primarily designed to monitor engraftment. We developed a panel of qPCR assays using TaqMan chemistry capable of quantifying chimerism on the order of 1-in-a-million. At such analytic sensitivity, we hypothesized it could inform on relapse risk. As a proof-of-concept, we applied our panel on a retrospective cohort of acute leukemia patients with known outcomes post-allo-HCT. Recipient cells in bone marrow aspirates (BMA) remained detectable in 97.8% of tested samples. Absolute recipient chimerism proportions and rates at which these proportions increased in BMA in the first 540 days post-allo-HCT were associated with relapse. Detectable MRD (measurable residual disease) by flow cytometry in BMA post-allo-HCT showed limited correlation with relapse. This correlation noticeably strengthened when combined with increased recipient chimerism in BMA, demonstrating the ability of our ultrasensitive chimerism assay to augment MRD data. Our technology reveals an underappreciated usefulness of clinical chimerism. Used side-by-side with MRD assays, it promises to improve identification of patients with the highest risk of disease reoccurrence for a chance for early intervention.
Preprint
Full-text available
BACKGROUND The Dutch Committee for the Evaluation of Oncological Drugs evaluates the effectiveness of new oncological drugs. The committee compares survival endpoints to the so-called PASKWIL-2023 criteria for palliative treatments. A positive recommendation depends on whether the median overall survival (OS) is below or above 12 months in the comparator arm. If the former applies, an OS benefit of at least 12 weeks and a hazard ratio (HR) smaller than 0.7 are required. If the latter applies, an OS or progression-free survival (PFS) benefit of at least 16 weeks and an HR smaller than 0.7 are required. Nonetheless, the median survival time may not be reached, and the proportional hazards (PH) assumption, quantified by the HR, is likely violated for immuno-oncology (IO) therapies, rendering these criteria inappropriate. METHODS We conducted a systematic literature review to identify statistical methods used to represent the clinical effectiveness of IO therapies based on trial data. We searched the MEDLINE and EMBASE databases from inception to August 31, 2022, limited to English-language papers. Methodological studies, randomized controlled trials, and discussion papers recognising key issues of survival data analysis of IO therapies were eligible for inclusion. RESULTS A total of 1,035 unique references were identified. After full-paper screening, 17 publications were included in the review. Additionally, 43 papers were identified through ‘snowballing’. We conclude that the current PASKWIL-2023 criteria are methodologically incorrect under non-PH. In that case, single summary statistics fail to capture the treatment effect, and any measure should be interpreted in combination with the Kaplan-Meier curves. We recommend ‘parameter-free’ measures, such as the difference in restricted mean survival time, which avoid assumptions on the underlying survival distribution. CONCLUSIONS The HR is commonly used to assess treatment effectiveness without investigating the validity of the PH assumption. This happens with the application of the PASKWIL-2023 criteria for palliative oncology treatments, which can only be valid under a PH setting. Under non-PH, alternative treatment effect measures are suggested. We propose a step-by-step approach supporting the choice of the most appropriate methods to quantify treatment effectiveness, which can be used to redefine the PASKWIL-2023 criteria or similar criteria in other clinical areas.
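The difference in restricted mean survival time recommended above is the difference between treatment arms in the area under the Kaplan-Meier curve up to a horizon tau. A minimal sketch of the RMST for one arm, assuming no tied observation times (toy data, not trial data):

```python
def km_rmst(times, events, tau):
    """Restricted mean survival time up to `tau`: the area under the
    Kaplan-Meier survival curve, an effect measure that remains valid
    when proportional hazards fails. `events`: 1 = event, 0 = censored."""
    data = sorted(zip(times, events))
    n, s, last_t, rmst = len(data), 1.0, 0.0, 0.0
    for t, d in data:
        if t > tau:
            break
        rmst += s * (t - last_t)        # area of the step ending at t
        if d:                           # event: KM curve drops
            s *= (n - 1) / n
        n -= 1                          # one fewer subject at risk
        last_t = t
    rmst += s * (tau - last_t)          # flat tail up to tau
    return rmst
```

Comparing `km_rmst` between two arms at the same tau gives the difference in mean survival time gained over that window, with no model assumptions beyond the KM estimator itself.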
Article
Full-text available
Newborn piglets have a high risk of being crushed by the sow, and this risk implies welfare and economic consequences. The aim of this study was to investigate the importance of differentiating between low viable (secondary crushing losses) and viable crushed (primary crushing losses) piglets for the evaluation of risk factors for crushing related to characteristics of the sow, the litter, and the environment. Eleven Swiss farmers recorded sows’ production data (parity class, gestation length, numbers of live-born and stillborn piglets), data (age, sex, weight, cause of death, and signs of weakness) for every live-born piglet that died in the first week after birth (piglet loss), and ambient temperature. Piglet losses were assigned to five categorical events: piglet loss, subdivided into not crushed and crushed, the latter being further subdivided into low viable crushed and viable crushed. Piglets recorded by the farmer as crushed were assigned to the events low viable crushed and viable crushed based on the piglet’s body weight and signs of weakness (diseases, malformations). Data of 9,543 live-born piglets from 740 litters were eventually used to statistically model the hazard of dying at any given time in the first week after birth due to one of these events (mixed-effects Cox model). Five potential risk factors were analyzed as co-variates: parity class, gestation length, number of live-born piglets, number of stillborn piglets, and daily number of hours with ambient temperature >30°C. We identified two risk factors for dying from the event viable crushed that were not identified as risk factors for low viable crushed, namely shorter gestation length and higher daily number of hours with ambient temperature > 30°C. Vice-versa, we identified additional live-born piglets in the litter as risk factor for low viable crushed, but not for viable crushed. 
Our results show the importance of differentiating between low viable and viable crushed piglets for the interpretation of risk factors for crushing losses. Therefore, we suggest that for breeding purposes and in research, this differentiation should be made.
Article
Full-text available
Research regarding the effects of automation on labor supply often assesses the labor force as a whole and disregards specific effects on aging workers. In light of rapid technological changes in the labor market, we assess the linkage between the automatability of aging workers and their retirement decisions. Based on the theoretical model of task-based technological changes and drawing data from the Health and Retirement Study and O*NET 2000–2018, we create an automatability index based on workers’ primary skills. Using the index as our main explanatory variable in Cox proportional hazards models and logit models, we find that skill-specific automatability increases the retirement likelihood, both in terms of their expected and actual timing of retirement. This work provides empirical evidence that individuals’ automatability renders the notion of “working at old age” less viable, despite the financial and health benefits of staying in the labor force for an extended period. Our findings offer important insights on how to better promote productive aging, for instance, by offering retraining programs for older workers to harness their soft skills to reduce automatability in the labor market.
Article
Actors rarely approach institutional design choices with a blank slate but are influenced by design choices made at earlier stages. How does institutional design evolve over time and are there specific paths to deepening cooperation? We investigate the institutional design paths of subnational cooperation that are chosen to address increasingly complex and interconnected policy problems. We theorize that besides the substantive problem, earlier choices matter to explain what institutional design mechanism is chosen; that is, the design of existing institutions between two subnational governance units, called substates, influences the design of subsequent institutions. Using a semi‐parametric Cox proportional hazards model, we show that the design paths of subnational cooperation in the Swiss water governance sector correlate with earlier design choices. Our results indicate that not all cooperation is self‐reinforcing and path‐dependent, but they show which specific design choices are more likely to follow each other in repeated formal federal intergovernmental cooperation.
Article
Background: Lynch Syndrome (LS) screening guidelines originally recommended colonoscopy every 1 to 2 years, beginning between the ages of 20 and 25 years. Recent studies have questioned the benefits of these short screening intervals in preventing colorectal cancer (CRC). Our goal is to determine how colonoscopy screening intervals impact CRC in patients with LS. Methods: We analyzed the demographics, screening practices, and outcomes of patients with LS identified through the clinic-based Familial Gastrointestinal Cancer Registry at the Zane Cohen Centre, Sinai Health System, Toronto. Results: A total of 429 patients with LS were identified, with a median follow-up of 9.2 years; 44 developed CRC. We found a positive trend between shorter screening intervals and the number of adenomas detected during colonoscopy. Any new adenoma detected at screening decreased 10-year CRC incidence by 11.3%. For MLH1 carriers, a screening interval of 1-2 years vs. 2-3 years led to a 20-year cumulative CRC risk reduction of 28% and 14% in females and males, respectively. For MSH2 carriers, this risk reduction was 29% and 17%, respectively, and for male MSH6 carriers 18%. Individuals without any adenomas detected (53.4% of LS carriers) had an increased 20-year CRC risk of 25.7% and 57.2% for women and men, respectively, compared to those diagnosed with adenomas at screening. Conclusions: The recommended colonoscopy screening interval of 1-2 years is efficient at detecting adenomas and reducing CRC risk. The observation that 53.4% of LS patients never had an adenoma warrants further investigation into a possible adenoma-free pathway.
Article
Full-text available
Among the emerging and reemerging arboviral diseases, Zika, dengue and chikungunya deserve special attention due to their wide geographical distribution and clinical severity. The three arboviruses are transmitted by the same vector and can present similar clinical syndromes, bringing challenges to their identification and register. Demographic characteristics and individual and contextual social factors have been associated with the three arboviral diseases. However, little is known about such associations among adolescents, whose relationships with the social environment are different from those of adult populations, implying potentially different places, types, and degrees of exposure to the vector, particularly in the school context. This study aims to identify sociodemographic and environmental risk factors for the occurrence of Zika, dengue, and chikungunya in a cohort of adolescents from the Study of Cardiovascular Risks in Adolescents-ERICA-in the cities of Rio de Janeiro/RJ and Fortaleza/CE, from January 2015 to March 2019. Cases were defined as adolescents with laboratory or clinical-epidemiological diagnosis of Zika, dengue, or chikungunya, notified and registered in the Information System for Notifiable Diseases (SINAN). The cases were identified by linkage between the databases of the ERICA cohort and of SINAN. Multilevel Cox regression was employed to estimate hazard ratios (HR) as measures of association and respective 95% confidence intervals (95%CI). In comparison with adolescents living in lower socioeconomic conditions, the risk of becoming ill due to any of the three studied arboviral diseases was lower among those living in better socioeconomic conditions (HR = 0.43; 95%CI: 0.19-0.99; p = 0.047) and in the adolescents who attended school in the afternoon period (HR = 0.17; 95%CI: 0.06-0.47; p
Article
Pharmaceutical claims data are often used as the primary information source to define medicine exposure periods in pharmacoepidemiological studies. However, often critical information on directions for use and the intended duration of medicine supply are not available. In the absence of this information, alternative approaches are needed to support the assignment of exposure periods. This study summarises the key methods commonly used to estimate medicine exposure periods and dose from pharmaceutical claims data; and describes a method using individualised dispensing patterns (IDP) to define time-dependent estimates of medicine exposure and dose. This method extends on important features of existing methods and also accounts for recent changes in an individual's medicine use. Specifically, this method constructs medicine exposure periods and estimates the dose used by considering characteristics from an individual's prior dispensings, accounting for the time between prior dispensings and the amount supplied at prior dispensings. Guidance on the practical applications of this method is also provided. Although developed primarily for application to databases which do not contain duration of supply or dose information, use of this method may also facilitate investigations when such information is available and there is a need to consider individualised and/or changing dosing regimens. By shifting the reliance on prescribed duration and dose to determine exposure and dose estimates, individualised dispensing information is used to estimate patterns of exposure and dose for an individual. Reflecting real-world individualised use of medicines with complex and variable dosing regimens, this method offers a pragmatic approach that can be applied to all medicine classes.
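The individualised dispensing patterns (IDP) idea can be illustrated by estimating each person's exposure duration from the gaps between their own prior dispensings rather than from a fixed assumed supply. A much-simplified sketch (the function name, the median-gap rule, and the 30-day fallback are assumptions for illustration; the published method also accounts for quantity supplied, grace periods, and changing regimens):

```python
from statistics import median

def exposure_end(dispensing_dates, default_days=30.0):
    """Estimate when exposure from the latest dispensing ends, using the
    individual's own dispensing pattern: the median gap between that
    person's prior dispensings, falling back to a default for subjects
    with a single dispensing. Dates are days since cohort entry."""
    gaps = [b - a for a, b in zip(dispensing_dates, dispensing_dates[1:])]
    est = median(gaps) if gaps else default_days
    return dispensing_dates[-1] + est

# Hypothetical dispensings on days 0, 30, 62, 90 -> median gap of 30 days
end = exposure_end([0, 30, 62, 90])
```

Because the estimate is refreshed at every dispensing, the resulting exposure periods adapt to recent changes in an individual's medicine use, which is the feature the method emphasises.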
Article
Background and aims: Opioid overdose is a leading cause of death during the immediate time after release from jail or prison. Most jails in the United States do not provide methadone and buprenorphine treatment for opioid use disorder (MOUD), and research in estimating its impact in jail settings is limited. We aimed to test the hypothesis that in-jail MOUD is associated with lower overdose mortality risk post-release. Design, setting and participants: Retrospective, observational cohort study of 15 797 adults with opioid use disorder who were released from New York City jails to the community in 2011-17. They experienced 31 382 incarcerations and were followed up to 1 year. Measurements: The primary outcomes were death caused by accidental drug poisoning and all-cause death. The exposure was receipt of MOUD (17 119 events) versus out-of-treatment (14 263 events) during the last 3 days before community reentry. Covariates included demographic, clinical, behavioral, housing, healthcare utilization, and legal characteristics variables. We performed multivariable, mixed-effect Cox regression analysis to test association between in-jail MOUD and deaths. Findings: A majority were male (82%) and their average age was 42 years. Receiving MOUD was associated with misdemeanor charges, being female, injection drug use, and homelessness. During 1 year post-release, 111 overdose deaths occurred, and crude death rates were 0.49 and 0.83 per 100 person-years for in-jail MOUD and out-of-treatment groups, respectively. Accounting for confounding and random effects, in-jail MOUD was associated with lower overdose mortality risk (adjusted hazard ratio = 0.20, 95% CI = 0.08-0.46), and all-cause mortality risk (adjusted hazard ratio = 0.22, 95% CI = 0.11-0.42) for the first month post-release. Conclusions: Methadone and buprenorphine treatment for opioid use disorder during incarceration was associated with an 80% reduction in overdose mortality risk for the first month post-release.
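The crude death rates above (0.49 and 0.83 per 100 person-years) are simply events divided by accumulated follow-up time, rescaled to 100 person-years. A minimal sketch with hypothetical follow-up data, not the study's:

```python
def crude_rate_per_100py(follow_up_years, events):
    """Crude event rate per 100 person-years: total events divided by
    total follow-up time, scaled to 100 person-years."""
    return 100.0 * sum(events) / sum(follow_up_years)

# Hypothetical: three releases followed 0.5, 1.0, and 0.5 years; one death
rate = crude_rate_per_100py([0.5, 1.0, 0.5], [1, 0, 0])
```

The Cox regression reported in the abstract then adjusts these crude comparisons for confounders and for repeated incarcerations via random effects.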
Article
Full-text available
In clinical or epidemiological follow-up studies, regression methods based on time-scale measures such as the restricted mean survival time (RMST) have been developed. Compared with traditional hazard-based measures, the RMST is easier to interpret and does not require the proportional hazards assumption. To date, regression models based on the RMST have related the RMST, directly or indirectly, to baseline covariates only. However, time-dependent covariates are becoming increasingly common in follow-up studies. Based on the inverse probability of censoring weighting (IPCW) method, we developed a regression model of the RMST on time-dependent covariates. Through Monte Carlo simulation, we verified the estimation performance of the proposed model's regression parameters. Compared with the time-dependent Cox model and the fixed (baseline) covariate RMST model, the time-dependent RMST model has better predictive ability. Finally, a heart transplantation example was used to verify these conclusions.
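The IPCW weighting that the model above builds on can be sketched for the simplest case, estimating the marginal RMST itself: censored observations drop out and fully observed ones are up-weighted by the inverse of the Kaplan-Meier estimate of the censoring survival function G(t). A minimal pure-Python sketch under that assumption (function names are ours; the paper's regression on time-dependent covariates requires considerably more machinery):

```python
def km_censoring_survival(times, events):
    """Kaplan-Meier estimate of the censoring survival function G(t).
    Censoring plays the role of the 'event' here, so the indicator
    is flipped. Ties between events and censorings are ignored in
    this sketch."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk, g, steps = n, 1.0, []
    for i in order:
        if not events[i]:                 # a censoring occurred
            g *= (at_risk - 1) / at_risk
        steps.append((times[i], g))
        at_risk -= 1
    return steps                          # (time, G(t)) step points

def G(steps, t):
    """Evaluate G just before t (left-continuous)."""
    g = 1.0
    for s, v in steps:
        if s < t:
            g = v
        else:
            break
    return g

def rmst_ipcw(times, events, tau):
    """IPCW estimate of E[min(T, tau)]. A subject contributes only
    if min(T, tau) is fully observed: the event happened, or
    follow-up reached tau."""
    steps = km_censoring_survival(times, events)
    total = 0.0
    for t, d in zip(times, events):
        x = min(t, tau)
        if d or t >= tau:
            total += x / max(G(steps, x), 1e-12)
    return total / len(times)
```

With no censoring the weights are all 1 and the estimate reduces to the plain mean of min(T, tau), which is a quick sanity check on the implementation.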
Article
BROCADE3 is a Phase 3 study, evaluating veliparib in combination with carboplatin/paclitaxel with continuation as monotherapy if carboplatin/paclitaxel is discontinued in patients with germline BRCA1/2 mutation-associated, advanced HER2-negative breast cancer. The objective of the current analysis was to characterize the veliparib exposure-response relationships for efficacy (progression-free survival; PFS) and safety in this study. Exposure-efficacy analyses of PFS were conducted using Kaplan-Meier plots and Cox Proportional Hazards (CPH) models using treatment alone or both treatment and exposure as time-dependent predictors to estimate the effect of veliparib in combination with carboplatin/paclitaxel and as monotherapy. The CPH model with only treatment as the time-varying predictor estimated a statistically significant benefit of veliparib monotherapy compared to placebo monotherapy (HR = 0.49 [95% CI, 0.33-0.73]) and a modest, non-statistically significant benefit (HR = 0.81 [95% CI, 0.62-1.05]) of adding veliparib to carboplatin/paclitaxel. Inclusion of exposure as an additional time-varying predictor in the CPH model indicated a flat exposure-response relationship between the veliparib exposure and PFS when veliparib was administered in combination with carboplatin/paclitaxel or as monotherapy. The exposure-safety analysis did not reveal any meaningful exposure-dependent trend in the incidence of adverse events of interest. These analyses support the dose regimen of veliparib (120 mg twice daily) in combination with carboplatin/paclitaxel and continuation of veliparib (300 to 400 mg twice daily) as monotherapy if carboplatin/paclitaxel were discontinued before disease progression in this patient population. This study is registered with ClinicalTrials.gov with a registration ID: NCT02163694.
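Fitting a Cox model with treatment as a time-dependent predictor requires the counting-process data layout: each subject's follow-up is split into (start, stop] rows and the treatment covariate is updated at the switch from combination therapy to monotherapy. A minimal sketch of that splitting step (field names and layout are illustrative, not taken from the BROCADE3 analysis):

```python
def split_at_switch(follow_up, event, switch_time):
    """Split one subject's follow-up into counting-process rows
    (start, stop, status, on_monotherapy) around a treatment switch.
    `switch_time` is None if the subject never switched."""
    rows = []
    if switch_time is None or switch_time >= follow_up:
        # one row: combination therapy for the whole follow-up
        rows.append((0.0, follow_up, event, 0))
    else:
        # row 1: combination therapy; no event at the switch itself
        rows.append((0.0, switch_time, 0, 0))
        # row 2: monotherapy until the event or censoring
        rows.append((switch_time, follow_up, event, 1))
    return rows
```

The resulting rows are exactly what R's `coxph(Surv(start, stop, status) ~ ...)` (or an equivalent counting-process interface) expects; a subject who progresses at month 10 after switching at month 4 yields `[(0.0, 4.0, 0, 0), (4.0, 10.0, 1, 1)]`.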
Article
Introduction: The RECOURSE trial demonstrated a modest benefit in overall survival (OS) for trifluridine/tipiracil (FTD/TPI) versus placebo in pretreated metastatic colorectal cancer (mCRC) patients. Unfortunately, quality of life (QoL) was not assessed. We evaluated QoL and survival of patients treated with FTD/TPI in daily practice. Methods: QUALITAS is a substudy of the Prospective Dutch CRC cohort (PLCRC). From 150 mCRC patients treated with FTD/TPI, QoL (EORTC QLQ-C30 and QLQ-CR29) was assessed monthly from study entry and linked to clinical data of the Netherlands Cancer Registry. Joint models were constructed combining mixed effects models with Cox PH models. Primary endpoint was difference in QoL over time (which was deemed clinically relevant if ≥10 points). Secondary endpoints were progression-free survival (PFS), time to treatment failure (TTF), and OS. We analyzed the association between QLQ-C30 Summary Score (QoL-SS) at FTD/TPI initiation (baseline) and survival. Results: There was no clinically relevant change in QoL-SS from baseline to 10 months post-baseline (i.e. the cut-off point after which 90% of patients had discontinued FTD/TPI treatment): -5.3 [95% CI -8.7;-1.5]. Patients who were treated with FTD/TPI for ≥3 months (n=85) reported 6.3 [1.6;11.1] points higher baseline QoL, compared to patients treated <3 months (n=65, 'poor responders'). In the latter, time to a clinically relevant QoL deterioration was <2 months. Median PFS, TTF and OS were 2.9 [2.7;3.1], 3.1 [2.9;3.2] and 7.7 [6.6;8.8] months, respectively. Worse baseline QoL-SS was independently associated with shorter OS (HR 0.45 [0.32;0.63]), PFS (0.63 [0.48;0.83]), and TTF (0.64 [0.47;0.86]). Conclusions: The maintenance of QoL during FTD/TPI treatment in daily practice supports its use. The QoL deterioration in 'poor responders' is likely due to disease progression. The strong association between worse baseline QoL and shorter survival suggests that clinicians should take QoL into account when determining prognosis and treatment strategy for individual patients.
Article
Neonicotinoids are a class of insecticide with global impacts on natural environments. Due to their high solubility, they are frequently found in stream ecosystems where they have the potential to impact non-target biota. While environmental concentrations are generally below lethal levels for most organisms, there are concerns that sublethal exposures can impact aquatic insects, particularly mayflies, which are highly sensitive to neonicotinoids. Because sublethal doses of neonicotinoids can reduce mobility in mayflies, exposure could indirectly increase mortality due to predation by impairing their ability to avoid initial detection or escape predators. We examined whether exposure to the neonicotinoid clothianidin at a concentration below the 96-h EC50 (7.5 µg/L) would increase the predation risk of Stenacron and Stenonema mayfly nymphs by larval southern two-lined salamanders (Eurycea cirrigera) or eastern dobsonfly nymphs (Corydalus cornutus) using a controlled laboratory experiment. For Stenacron, we found significant interactive effects between pesticide and dobsonfly exposure that increased the hazard ratio (HR). The HR assesses risk relative to a control population, in this case mayflies in similar experimental conditions but without exposure to neonicotinoids or predators. With the addition of clothianidin, the HR of mayflies exposed to a dobsonfly nymph significantly increased from 1.8 to 6.2, while the HR for those exposed to salamanders increased from 7.6 to 12.5. For Stenonema, the HR initially decreased due to dobsonfly exposure (1 to 0.3) but increased when clothianidin and dobsonflies were combined (0.3 to 1.6). Our study shows that aquatic exposure to clothianidin can increase mortality for aquatic insects through indirect pathways. Such indirect effects associated with neonicotinoid exposure warrant further investigation to expand our understanding of pesticide impacts on aquatic systems.
Article
Background Medications for opioid use disorder (MOUDs), including methadone, buprenorphine and naltrexone, are associated with lower death rates and improved quality of life for people in recovery from opioid use disorder (OUD). Less is known about each MOUD modality's association with treatment retention and the contribution of behavioral health therapy (BHT). The objectives of the current study were to estimate the association between MOUD type and treatment retention and to determine whether BHT was associated with length of time retained. Methods We investigated the time from initiation to discontinuation of MOUD by medication type and exposure to BHT using statewide Medicaid claims data (N = 81,752). We estimated covariate-adjusted hazard ratios (AHR) using a Cox proportional hazards model. Results Compared to methadone, buprenorphine was associated with a higher risk of discontinuation at the time of initiation (AHR = 2.41, 95% CI = 2.28–2.55); however, that difference decreased over one year of maintained retention (AHR = 1.44, 95% CI = 1.37–1.50). Compared to methadone and buprenorphine, naltrexone was associated with a higher risk of discontinuation at the time of initiation (naltrexone vs. methadone AHR = 2.49, 95% CI = 2.30–2.65; naltrexone vs. buprenorphine AHR = 1.03, 95% CI = 1.00–1.07), and that relative risk increased over the course of one year of retention (naltrexone vs. methadone AHR = 3.85, 95% CI = 3.63–4.09; naltrexone vs. buprenorphine AHR = 2.67, 95% CI = 2.54–2.81). In general, independent of MOUD type, exposure to BHT during MOUD treatment was associated with a lower risk of discontinuation (AHR = 0.94, 95% CI = 0.92–0.96). However, BHT during the treatment episode was not associated with retention in the adolescent/young adult and pregnant women subpopulations. Discussion From the standpoint of early success, methadone was associated with the lowest risk of treatment discontinuation. While buprenorphine and naltrexone were associated with similar risks at the beginning of treatment, the relative discontinuation risk for buprenorphine was less than half that of naltrexone at one year of retention. In general, BHT with MOUD was associated with a lower risk of treatment discontinuation.
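Reporting one AHR at treatment initiation and a different one at one year implies a hazard ratio that changes over retention time, i.e. a treatment-by-time interaction in the Cox model. Assuming, purely for illustration, that the log hazard ratio moves linearly between the two reported values, the implied HR at intermediate times can be computed:

```python
import math

def hr_at(t, hr0, hr1):
    """Hazard ratio at time t (in years), assuming the log-HR moves
    linearly from its value at initiation (hr0) to one year (hr1).
    The functional form is an illustrative assumption, not the
    authors' fitted model."""
    return math.exp((1 - t) * math.log(hr0) + t * math.log(hr1))

# buprenorphine vs. methadone AHRs reported above
print(round(hr_at(0.5, 2.41, 1.44), 2))   # prints 1.86
```

At six months the implied discontinuation hazard for buprenorphine would still be roughly 1.9 times that of methadone under this interpolation, which is the geometric mean of the two reported AHRs.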
Article
We evaluate whether, in addition to seeking higher stock performance from acquisitions, firm managers also seek predictable stock performance. Building on prior research, we argue that higher volatility in a firm's stock performance following an acquisition is associated with divestment of a prior acquisition. Survival analysis of a longitudinal panel dataset of 738 matched U.S. acquisitions with 9,973 firm-year observations shows that higher volatility in an acquiring firm's stock performance following an acquisition significantly predicts divestment of a prior acquisition. This effect is independent of, and stronger than, the effect of the acquiring firm's mean stock performance following an acquisition. We also find moderating effects of acquisition relatedness and acquiring firm stock performance after an acquisition. When an acquired unit is related to the parent firm's operations, it is more likely to be divested if subsequent stock performance displays higher volatility. Still, higher overall stock performance makes a parent firm less likely to divest an acquired unit, suggesting that larger mean returns override concerns about higher stock performance volatility during corporate restructuring.
Article
Full-text available
Objective To investigate surgical complications related to the staging procedure for endometrial cancer (EC) and to explore associations of complications with patient characteristics and survival. Methods A population-based cohort study of women diagnosed with EC whose primary surgery was performed at a tertiary centre between 2012 and 2016. The Swedish Quality Registry for Gynecological Cancer was used for identification; medical records were reviewed and surgical outcomes, including complications according to Clavien-Dindo (CD) and comorbidity (Charlson's index), were registered. Uni- and multivariable logistic regression analyses were performed with complications as outcome, and multivariable Cox regression analysis with overall survival (OS) as endpoint. Results In total, 549 women were identified, of whom 108 (19.7%) had CD grade II–V complications. In the multivariable regression analysis, surgical technique, BMI and lymph node dissection, but not comorbidity or age, were found to be risk factors for complications of CD grade II–V, with an OR of 0.32 (95%CI:0.18–0.56) for minimally invasive surgery (MIS) compared to open surgery, OR 2.18 (95%CI:1.37–3.49) for BMI ≥30, and OR 2.63 (95%CI:1.32–5.31) for pelvic and paraaortic lymph node dissection. In the Cox regression analysis, significantly lower OS was found within the first 1.5 years for patients with complications (CD grade II–V) compared to those without. Conclusion Surgical staging with lymphadenectomy, together with high BMI, was found to be a risk factor for complications in EC. MIS was associated with significantly fewer complications. Overall survival was negatively affected within the first years after complications. Our results may be taken into consideration when updating treatment guidelines that include surgical staging.
Article
Randomized vaccine trials are used to assess vaccine efficacy (VE) and to characterize the durability of vaccine-induced protection. If efficacy is demonstrated, the treatment of placebo volunteers becomes an issue. For COVID-19 vaccine trials, there is broad consensus that placebo volunteers should be offered a vaccine once efficacy has been established. This will likely lead to most placebo volunteers crossing over to the vaccine arm, thus complicating the assessment of long-term durability. We show how to analyze durability following placebo crossover and demonstrate that the VE profile that would be observed in a placebo-controlled trial is recoverable in a trial with placebo crossover. This result holds no matter when the crossover occurs and with no assumptions about the form of the efficacy profile. We only require that the VE profile applies to the newly vaccinated irrespective of the timing of vaccination. We develop different methods to estimate efficacy within the context of a proportional hazards regression model and explore via simulation the implications of placebo crossover for estimation of VE under different efficacy dynamics and study designs. We apply our methods to simulated COVID-19 vaccine trials with durable and waning VE and a total follow-up of 2 years.
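The identifying assumption above, that the VE profile depends only on time since vaccination, means that after crossover vaccinated person-time from both arms can be pooled on the time-since-vaccination clock. A deliberately simplified, rate-based sketch of that pooling (the paper works within proportional hazards regression; the constant unvaccinated event rate here is an illustrative assumption, and all names are ours):

```python
def ve_by_interval(vax_records, unvax_rate, bins):
    """Crude VE within bins of time since vaccination.
    vax_records: (follow_up, event) pairs, with follow-up measured
    from each subject's own vaccination date, so the original
    vaccine arm and the crossed-over placebo arm contribute alike.
    unvax_rate: assumed constant event rate in the unvaccinated.
    bins: interval edges, e.g. [0, 90, 180, 365] (days)."""
    out = []
    for lo, hi in zip(bins, bins[1:]):
        person_time, events = 0.0, 0
        for t, d in vax_records:
            if t <= lo:
                continue                       # never reached this bin
            person_time += min(t, hi) - lo     # time spent inside bin
            if d and lo < t <= hi:
                events += 1                    # event occurred in bin
        rate = events / person_time if person_time > 0 else float("nan")
        out.append((lo, hi, 1 - rate / unvax_rate))
    return out
```

Comparing the per-bin VE estimates across intervals is the crude analogue of the waning-versus-durable efficacy profiles the paper studies by simulation.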
Article
Full-text available
Objectives We aim to compare differences in mortality risk factors between admission and follow-up incorporated models. Methods A retrospective cohort study of 524 patients with confirmed COVID-19 infection admitted to a tertiary medical center in São Paulo, Brazil, from March 13th to April 30th, 2020. We collected data at admission and on the 3rd, 8th and 14th days of hospitalization. We calculated the hazard ratio (HR) and compared 28-day in-hospital mortality risk factors between admission and follow-up models using a time-dependent Cox regression model. Results Of 524 patients, 50.4% needed mechanical ventilation. The 28-day mortality rate was 32.8%. Compared to follow-up models, admission models underestimated the mortality HR for peripheral oxygen saturation <92% (1.21 versus 2.09), heart rate >100 bpm (1.19 versus 2.04), respiratory rate >24 breaths/min (1.01 versus 1.82) and mechanical ventilation (1.92 versus 12.93). Low oxygen saturation, higher oxygen support and more biomarkers, including lactate dehydrogenase, C-reactive protein, neutrophil-lymphocyte ratio and urea, remained associated with mortality after adjustment for clinical factors at follow-up, compared to only urea and oxygen support at admission. Conclusions The inclusion of follow-up measurements changed the mortality hazards of clinical signs and biomarkers. Low oxygen saturation, higher oxygen support, lactate dehydrogenase, C-reactive protein, neutrophil-lymphocyte ratio and urea could help in assessing patient prognosis during follow-up.
Article
It is hypothesized that the systemic inflammation associated with rheumatoid arthritis (RA) promotes an increased risk of cardiovascular (CV) morbidity and mortality. We examined the risk and determinants of congestive heart failure (CHF) in patients with RA. We assembled a population-based, retrospective incidence cohort from among all individuals living in Rochester, Minnesota, in whom RA (defined according to the American College of Rheumatology 1987 criteria) was first diagnosed between 1955 and 1995, and an age- and sex-matched non-RA cohort. After excluding patients in whom CHF occurred before the RA index date, all subjects were followed up until either death, incident CHF (defined according to the Framingham Heart Study criteria), migration from the county, or January 1, 2001. Detailed information from the complete medical records (including all inpatient and outpatient care provided by all local providers) regarding RA, ischemic heart disease, and traditional CV risk factors was collected. Cox models were used to estimate the effect of RA on the development of CHF, adjusting for CV risk factors and/or ischemic heart disease. The study population included 575 patients with RA and 583 subjects without RA. The CHF incidence rates were 1.99 and 1.16 cases per 100 person-years in patients with RA and in non-RA subjects, respectively (rate ratio 1.7, 95% confidence interval [95% CI] 1.3-2.1). After 30 years of follow-up, the cumulative incidence of CHF was 34.0% in patients with RA and 25.2% in non-RA subjects (P < 0.001). RA conferred a significant excess risk of CHF (hazard ratio [HR] 1.87, 95% CI 1.47-2.39) after adjusting for demographics, ischemic heart disease, and CV risk factors. The risk was higher among patients with RA who were rheumatoid factor (RF) positive (HR 2.59, 95% CI 1.95-3.43) than among those who were RF negative (HR 1.28, 95% CI 0.93-1.78). Compared with persons without RA, patients with RA have twice the risk of developing CHF. This excess risk is not explained by traditional CV risk factors and/or clinical ischemic heart disease.