Figure 2 - uploaded by Annamaria Carusi
Voltage Clamp Apparatus. Two Electrode Voltage Clamp Apparatus. From https://www.warneronline.com/product_info.cfm?name=TEV-700+Two+Electrode+Voltage+Clamp+Workstation&id=170.

Source publication
Article
Full-text available
In silico medicine is still forging a road for itself in the current biomedical landscape. Discursively and rhetorically, it is using a three-way positioning, first, deploying discourses of personalised medicine, second, extending the 3Rs from animal to clinical research, and third, aligning its methods with experimental methods. The discursive and...

Context in source publication

Context 1
... in academic laboratories, a typical methodology for clinicians in this domain is the voltage clamp experiment, which controls the flow of conductances across the ion channels of the cell membrane. Voltage clamp experiments typically involve the kind of apparatus shown in Figure 2. Computational modelling and simulation instead typically involve apparatus such as PCs and software packages such as Matlab, ...

Citations

... The term "social" can include a broad array of factors. In the case of this project, a first main focus was on understanding the mediation of interdisciplinary collaboration through structures similar to the AOP framework, informed by a broad literature on scientific tools as mediators [a classic being (20)], with an application to biosciences in (21). The second main focus was the role such frameworks can have in the nature of the inter-relationships among participants contributing via a crowdsourcing model: their motivations, their rewards, and the unfolding of the collaboration over disagreements as well as agreements. ...
Article
Introduction The CIAO project was launched in Spring 2020 to address the need to make sense of the numerous and disparate data available on COVID-19 pathogenesis. Based on a crowdsourcing model of large-scale collaboration, the project has exploited the Adverse Outcome Pathway (AOP) knowledge management framework, built to support chemical risk assessment driven by mechanistic understanding of the biological perturbations at the different organizational levels. The AOPs might therefore have real potential to integrate data produced through different approaches and from different disciplines, as experienced in the context of COVID-19. In this study, we aim to assess the effectiveness of the AOP framework (i) in supporting an interdisciplinary collaboration for a viral disease and (ii) in working as the conceptual mediator of a crowdsourcing model of collaboration. Methods We used a survey disseminated among the CIAO participants, a workshop open to all interested CIAO contributors, a series of interviews with some participants, and a self-reflection on the processes. Results The project has supported genuine interdisciplinarity with exchange of knowledge. The framework provided a common reference point for discussion and collaboration. The diagram used in the AOPs helped make explicit the different perspectives brought to the knowledge about the pathways. The AOP-Wiki revealed many aspects of its usability for those not already in the world of AOPs, while its use in CIAO highlighted adaptations that were needed; the introduction of new Wiki elements for modulating factors was potentially the most disruptive. Regarding how well AOPs support a crowdsourcing model of large-scale collaboration, the CIAO project showed that this succeeds when there is a strong central organizational impetus and when clarity about the terms of the collaboration is established as early as possible.
Discussion Extrapolating the successful CIAO approach and related processes to other areas of science, where the AOP could foster interdisciplinary and systematic organization of knowledge, is an exciting perspective.
... The 3R approach greatly influences common procedures for the definition, design, optimization, reliability assessment, and certification of surgical procedures, instrumentations, and devices. In pursuing this ethical and mandatory 3R approach, the competences and skills of bioengineers can play a meaningful role [4]. Computational biomechanics provides structural modeling tools that allow for simulating the mechanical behavior of anatomical regions and the interaction phenomena occurring among such biological structures and surgical instrumentations [5] and/or prosthetic devices [6]. ...
Article
Biomechanical investigations of surgical procedures and devices are usually developed by means of human or animal models. The exploitation of computational methods and tools can reduce, refine, and replace (3R) animal experimentation for scientific purposes and for pre-clinical research. The computational model of a biological structure characterizes both its geometrical conformation and the mechanical behavior of its building tissues. Model development requires coupled experimental and computational activities. Medical images and anthropometric information provide the geometrical definition of the computational model. Histological investigations and mechanical tests on tissue samples allow for characterizing biological tissues’ mechanical response by means of constitutive models. The assessment of computational model reliability requires comparing model results with data from further experimentation. Computational methods allow for the in-silico analysis of surgical procedures and devices’ functionality considering many different influencing variables, the experimental investigation of which would be extremely expensive and time consuming. Furthermore, computational methods provide information that experimental methods barely supply, such as the strain and stress fields that regulate important mechano-biological phenomena. In this work, general notes about the development of biomechanical tools are proposed, together with specific applications to different fields, such as dental implantology and bariatric surgery.
... In other words, Schork suggests that data should be harvested in the course of daily treatment, and that all patients should be regarded as research participants (Hogle, 2016b). 2 With N-of-1 trials in daily clinical practice, Schork thus imagines individuals contributing to the population in the hope that population data can guide their own and other patients' treatment. Such plans necessitate a centralized IT infrastructure for health data (Carusi, 2016). Heavy investments in collection, storage and use of health data are therefore key components of the personalized medicine agenda (Prainsack, 2015). ...
Article
‘Personalized medicine’ might sound like the very antithesis of population science and public health, with the individual taking the place of the population. However, in practice, personalized medicine generates heavy investments in the population sciences – particularly in data-sourcing initiatives. Intensified data sourcing implies new roles and responsibilities for patients and health professionals, who become responsible not only for data contributions, but also for responding to new uses of data in personalized prevention, drawing upon detailed mapping of risk distribution in the population. Although this population-based ‘personalization’ of prevention and treatment is said to be about making the health services ‘data-driven’, the policies and plans themselves use existing data and evidence in a very selective manner. It is as if data-driven decision-making is a promise for an unspecified future, not a demand on its planning in the present. I therefore suggest interrogating how ‘promissory data’ interact with ideas about accountability in public health policies, and also with the data initiatives that the promises bring about. Intensified data collection might not just be interesting for what it allows authorities to do and know, but also for how its promises of future evidence can be used to postpone action and sidestep uncomfortable knowledge in the present.
Article
Regulating industrial chemicals in foodstuffs and consumer products is a major aspect of protecting populations against health risks. Non-animal testing methods are an essential part of the radical change to the framework for toxicity testing that is long overdue in global economies. This paper discusses reasons why the drive to reduce animal testing for chemical safety testing is so difficult to achieve, as perceived by those who are closely involved in chemicals regulations in different capacities. Progress is slow, despite the fact that the ethico-legal conditions for a move away from animal testing are largely in place, and despite scientific arguments for a radical change in the paradigm of toxicity testing, away from reliance on animal studies. I present empirical data drawn from two studies in a European Commission context promoting non-animal methods. The aim of the paper is modest. It is to foreground the voices of those who deal with the science and regulation of chemicals on a day-to-day basis, rather than to offer a theoretical framework for what I heard from them. I offer a synthesis of the main challenges faced by non-animal alternatives, as these are perceived by people in different stakeholder groups dealing with chemicals regulation. I show where there are pockets of agreement between different stakeholders, and where the main disagreements lie. In particular there is dispute and disagreement over what counts as validation of these alternative tests, and by implication of the traditional ‘gold standard’ of animal testing. Finally, I suggest that the shift to non-animal methods in chemicals regulation demonstrates the need for the concept of validation to be broadened from a purely techno-scientific definition, and be more explicitly understood as a demand for trust and acceptance, with more attention given to the complex social, institutional and economic settings in which it operates.
Article
On May 25, 2018, the European Union’s General Data Protection Regulation (GDPR) came into force. EU citizens are granted more control over personal data while companies and organizations are charged with increased responsibility enshrined in broad principles like transparency and accountability. Given the scope of the regulation, which aims to harmonize data practices across 28 member states with different concerns about data collection, the GDPR has significant consequences for individuals in the EU and globally. While the GDPR is primarily intended to regulate tech companies, it also has important implications for data use in scientific research. Drawing on ethnographic fieldwork with researchers, lawyers and legal scholars in Sweden, I argue that the GDPR’s flexible accountability principle effectively encourages researchers to reflect on their ethical responsibility but can also become a source of anxiety and produce unexpected results. Many researchers I spoke with expressed profound uncertainty about ‘impossible’ legal requirements for research data use. Despite the availability of legal texts and interpretations, I suggest we should take researchers’ concerns about ‘unknowable’ data law seriously. Many researchers’ sense of legal ambiguity led them to rethink their data practices and themselves as ethical subjects through an orientation to what they imagined as the ‘real people behind the data’, variously formulated as a Swedish population desiring data use for social benefit or a transnational public eager for research results. The intentions attributed to people, populations and publics – whom researchers only encountered in the abstract form of data – lent ethical weight to various and sometimes conflicting decisions about data security and sharing. Ultimately, researchers’ anxieties about their inability to discern the desires of the ‘real people’ lent new appeal to solutions, however flawed, that promised to alleviate the ethical burden of personal data.
Article
This paper offers a contribution to debates around integrative aspects of systems biology and engages with issues related to the circumstances under which physicists look at biological problems. We use oral history as one of the methodological tools to gather the empirical material, conducting interviews with physicists working in systems biology. The interviews were conducted at several institutions in Brazil, Germany, Israel and the U.S. Biological research has been increasingly dependent on computational methods, high-throughput technologies, and multidisciplinary skills. Quantitative scientists are joining biological departments and collaborations between physicists and biologists are particularly vigorous. This state of affairs raises a number of questions, such as: What are the circumstances under which physicists approach biological problems in systems biology? What kind of interdisciplinary challenges must be tackled? The paper suggests that, concerning physicists’ move to work on biological systems, there are common reasons to move, the transition must be understood in terms of degrees, physicists have a rationale for simplifying systems, and distinct conceptions of model and modeling strategies are recurrent. We identified problems regarding linguistic clarity and integration of epistemological aims. We conclude that cultural unconformities within the systems biology community have important consequences for the flow of scientific knowledge.
Article
In experimental settings, scientists often “make” new things, in which case the aim is to intervene in order to produce experimental objects and processes—characterized as ‘effects’. In this discussion, I illuminate an important performative function in measurement and experimentation in general: intervention-based experimental production (IEP). I argue that even though the goal of IEP is the production of new effects, it can be informative for causal details in scientific representations. Specifically, IEP can be informative about causal relations in: regularities under study; ‘intervention systems’, which are measurement/experimental systems; and new technological systems.