197 results for slow science


Relevance: 20.00%

Abstract:

This letter to the Editor comments on the article "On the limitations of probability in conceptualizing pattern matches in forensic science" by P. T. Jayaprakash (Forensic Science International

Relevance: 20.00%

Abstract:

A criminal investigation requires searching for and interpreting the vestiges of a criminal act that happened in the past. In this context, the forensic investigator acts as a critical reader of the investigation scene, searching for physical traces that should enable her to tell the story of the offence/crime that allegedly occurred. The challenge for any investigator is to detect and recognise relevant physical traces in order to provide forensic clues for investigation and intelligence purposes. Inspired by this observation, the current research focuses on the following questions: What is a relevant physical trace? And how does the forensic investigator know she is facing one? The interest of these questions lies in providing a definition of a dimension often used in forensic science but never studied in its implications and operations. This doctoral research investigates scientific paths that are rarely explored in forensic science, using semiotic and sociological tools combined with statistical data analysis.

The results follow two tracks: a semiotic track, strongly influenced by Peirce's studies, and an empirical track, in which investigation data were analysed and forensic investigators were interviewed about their work practices in the field. The semiotic track gives a macroscopic view of a signification process running from the physical trace discovered at the scene to what the investigator evaluates as relevant. The physical trace is perceived in the form of several signs whose meaning is culturally codified. The reasoning consists of three main steps: 1) What kind of source does the discovered physical trace refer to? 2) What cause or activity is at the origin of this source in the specific context of the case? 3) What story can be told from these observations? Step 3 requires reasoning by creating hypotheses that should explain the presence of the discovered trace as resulting from an activity, specifically the activity related to the investigated case. Whether these hypotheses are validated would depend on their ability to satisfy a rule of relevancy. The last step is the symbolisation of relevancy. The rule consists of two points: the recognition of factual/circumstantial relevancy (is the link between the trace and the case recognised in the formulated hypothesis?) and appropriate relevancy (what investment is required to collect and analyse the discovered trace, considering the expected outcome at the investigation/intelligence level?). This process of meaning is based on observations and on conjectural reasoning subject to many influences.

In this study, relevancy in forensic science is presented as a conventional dimension that is symbolised and conditioned by the context, the forensic investigator's practice and her workplace environment (the culture of the place). In short, the current research states that relevancy results from the interactions between parameters of situational, structural (or organisational) and individual orders. The detection, collection and analysis of relevant physical traces at scenes depend on the knowledge and culture mastered by the forensic investigator. In studying the relation between the relevant trace and the forensic investigator, this research introduces the KEE model as a conceptual map illustrating three major areas of forensic knowledge and culture acquisition involved in the search for and evaluation of the relevant physical trace.

The analysis of the investigation data and the interviews highlighted the relationship between these three parameters and relevancy. K, for knowing, embodies a relationship to immediate knowledge, allowing an overview of reality at a specific moment to be formed; this point matters because relevancy is signified in a context. E, for education, is considered through its relationship with relevancy via a culture that tends to become institutionalised; it represents theoretical knowledge. The second E, for experience, exists in its relation to relevancy through the adjustment of intervention strategies (i.e., practical knowledge) by each practitioner who has modulated her work in light of successes and setbacks, case after case. The two E parameters constitute the library of resources for the semiotic recognition process, and the K parameter ensures the contextualisation required to set up the reasoning and to formulate explanatory hypotheses for the discovered physical traces questioned in their relevancy.

This research demonstrates that relevancy is not absolute. It is temporal and contextual; it is a conventional and relative dimension that must be discussed. This is where the whole issue of the meaning of what is relevant to each stakeholder of the investigation process rests. By proposing a step-by-step approach to the meaning process from the physical trace to the forensic clue, this study aims to provide a more advanced understanding of the reasoning and its operation, in order to strengthen forensic investigators' training. This doctoral research presents a set of tools of both pedagogical and practical value for crime scene management, while identifying key influences of individual, structural and situational dimensions.

Relevance: 20.00%

Abstract:

Forensic science is increasingly relied upon by law enforcement to assist in solving crime and gaining convictions, and by the judicial system in the adjudication of specific criminal cases. However, the value of forensic science relative to the work involved and the outcome of cases has yet to be established in the Australian context. Previous research in this area has mainly focused on the science and technology, rather than examining how people can use forensic services/science to the best possible advantage to produce appropriate justice outcomes. This five-year project entails an investigation into the effectiveness of forensic science in police investigations and court trials. It aims to identify when, where and how forensic science can add value to criminal investigations, court trials and justice outcomes while ensuring the efficient use of available resources, initially in the Victorian and ACT criminal justice systems and ultimately across Australia and New Zealand. This paper provides an overview of the rationale and aims of the research project and discusses current work-in-progress.

Relevance: 20.00%

Abstract:

Statistics occupies a prominent role in science and in citizens' daily lives. This article provides a state-of-the-art review of the problems associated with statistics in science and in society, structured along the three paradigms defined by Bauer, Allum and Miller (2007). It explores in more detail medicine and the public understanding of science on the one hand, and risks and surveys on the other. Statistics has received a good deal of attention; however, it is very often handled in terms of deficit, whether of scientists or of citizens. Many tools have been proposed to improve statistical literacy and the image of and trust in statistics, but with little understanding of their roots, little coordination among stakeholders and few assessments of impact. These deficiencies point to new and promising directions in which the PUS research agenda could be expanded.

Relevance: 20.00%

Abstract:

The end-Permian mass extinction removed more than 80% of marine genera. Ammonoid cephalopods were among the organisms most affected by this crisis. The analysis of a global diversity data set of ammonoid genera covering about 106 million years centered on the Permian-Triassic boundary (PTB) shows that Triassic ammonoids actually reached levels of diversity higher than in the Permian less than 2 million years after the PTB. The data favor a hierarchical rather than logistic model of diversification coupled with a niche incumbency hypothesis. This explosive and nondelayed diversification contrasts with the slow and delayed character of the Triassic biotic recovery as currently illustrated for other, mainly benthic groups such as bivalves and gastropods.
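
The contrast drawn here between a logistic (diversity-dependent, equilibrial) model and a hierarchical model of diversification can be made concrete with a small curve-fitting sketch. The generic counts below are synthetic and purely illustrative, and exponential growth is used only as a simple stand-in for diversity-independent expansion; the abstract does not specify the exact model formulations used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic post-extinction recovery data: time since the PTB (Myr)
# vs. number of ammonoid genera (illustrative values only, not from the paper).
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0])
genera = np.array([3, 10, 35, 70, 110, 140, 150, 155])

def exponential(t, d0, r):
    """Unconstrained (density-independent) diversification."""
    return d0 * np.exp(r * t)

def logistic(t, k, d0, r):
    """Diversity-dependent (equilibrial) diversification with carrying capacity k."""
    return k / (1.0 + ((k - d0) / d0) * np.exp(-r * t))

# Fit both model forms and compare residual sums of squares.
p_exp, _ = curve_fit(exponential, t, genera, p0=[30, 0.5], maxfev=10000)
p_log, _ = curve_fit(logistic, t, genera, p0=[160, 5, 2.0], maxfev=10000)

for name, model, params in [("exponential", exponential, p_exp),
                            ("logistic", logistic, p_log)]:
    resid = genera - model(t, *params)
    print(name, params.round(2), "SSE =", round(float(np.sum(resid ** 2)), 1))
```

On rapidly saturating counts like these, the logistic form fits markedly better, which is the kind of comparison that underlies statements about which diversification model the data favor.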

Relevance: 20.00%

Abstract:

PURPOSE: To evaluate the antimitotic and toxic effects of 5-chlorouracil (5-CU) and 5-fluorouracil (5-FU) and to study their potential to delay filtering bleb closure in the rabbit eye when released by poly(ortho esters) (POE). METHODS: Rabbit Tenon fibroblasts and human conjunctival cells were incubated with various 5-CU and 5-FU concentrations. Antiproliferative effects and toxicity were evaluated at 24 and 72 hours by monotetrazolium, neutral red, and Hoechst tests and by cell counting. Mechanisms of cell death were evaluated using the TUNEL assay, annexin V binding, and immunohistochemistry for apoptosis-inducing factor (AIF) and LEI/L-DNase II. Trabeculectomy was performed in pigmented rabbits. Two hundred microliters of POE loaded with 1% wt/wt 5-FU or 5-CU was injected into the subconjunctival space after surgery. Intraocular pressure (IOP) and bleb persistence were monitored for 150 days. RESULTS: In vitro, 5-FU showed a stronger antiproliferative effect and was more toxic than 5-CU. 5-FU induced cell necrosis, whereas 5-CU induced mostly apoptosis. The apoptosis induced by 5-CU was driven through a non-caspase-dependent pathway involving AIF and LEI/L-DNase II. In vivo, at 34 days after surgery, the mean IOP in the POE/5-CU-treated group was 83% of the baseline level, versus only 40% in the POE/5-FU-treated group. At 100 days after surgery, IOP was still decreased in the POE/5-CU group compared with the controls and still below the preoperative value. The mean long-term IOP, with all time points considered, was significantly (P < 0.0001) lower in the POE/5-CU-treated group (6.0 +/- 2.4 mm Hg) than in both control groups, the trabeculectomy-alone group (7.6 +/- 2.9 mm Hg) and the POE-alone group (7.5 +/- 2.6 mm Hg). Histologic analysis showed evidence of functioning blebs in the POE/5-CU-treated eyes, along with a preserved structure of the conjunctival epithelium. CONCLUSIONS: The slow release of 5-CU from POE has a long-lasting effect on the decrease of IOP after glaucoma-filtering surgery in the rabbit eye. Thus, slow-release POE/5-CU may be beneficial for the prevention of bleb closure in patients who undergo complicated trabeculectomy.

Relevance: 20.00%

Abstract:

This article presents a global vision of images in forensic science. The proliferation of perspectives on the use of images throughout criminal investigations and the increasing demand for research on this topic seem to demand a forensic science-based analysis. In this study, the definitions of and concepts related to material traces are revisited and applied to images, and a structured approach is used to persuade the scientific community to extend and improve the use of images as traces in criminal investigations. Current research efforts focus on technical issues and evidence assessment. This article provides a sound foundation for rationalising and explaining the processes involved in the production of clues from trace images. For example, the mechanisms through which these visual traces become clues of presence or action are described. An extensive literature review of forensic image analysis emphasises the existing guidelines and knowledge available for answering investigative questions (who, what, where, when and how). However, complementary developments are still necessary to demystify many aspects of image analysis in forensic science, including how to review and select images or use them to reconstruct an event or assist intelligence efforts. The hypothetico-deductive reasoning pathway used to discover unknown elements of an event or crime can also help scientists understand the underlying processes involved in their decision making. An analysis of a single image in an investigative or probative context is used to demonstrate the highly informative potential of images as traces and/or clues. Research efforts should be directed toward formalising the extraction and combination of clues from images. An appropriate methodology is key to expanding the use of images in forensic science.

Relevance: 20.00%

Abstract:

Abstract: In this thesis we present the design of a systematic, integrated, computer-based approach for detecting potential disruptions from an industry perspective. Following the design science paradigm, we iteratively develop several multi-actor, multi-criteria artifacts dedicated to environment scanning. The contributions of this thesis are both theoretical and practical. We demonstrate the successful use of multi-criteria decision-making methods for technology foresight. Furthermore, we illustrate the design of our artifacts using build-and-evaluate loops supported by a field study of the Swiss mobile payment industry. To increase the relevance of this study, we systematically interview key Swiss experts for each design iteration. As a result, our research provides a realistic picture of the current situation in the Swiss mobile payment market and reveals previously undiscovered weak signals for future trends. Finally, we suggest a generic design process for environment scanning.
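
The multi-criteria decision-making step mentioned here can be illustrated with a minimal weighted-sum scoring sketch. The scenarios, criteria, weights and scores below are hypothetical placeholders for illustration only; they are not the artifacts, experts or field data from the thesis.

```python
# Minimal weighted-sum multi-criteria scoring sketch (hypothetical data).
# Each scenario is scored per criterion on a 0-10 scale; weights reflect the
# assumed importance of each criterion and sum to 1.
criteria_weights = {"market_readiness": 0.4, "regulatory_fit": 0.25,
                    "technology_maturity": 0.2, "ecosystem_support": 0.15}

scenarios = {
    "bank-led wallet":      {"market_readiness": 7, "regulatory_fit": 8,
                             "technology_maturity": 6, "ecosystem_support": 5},
    "telco-led wallet":     {"market_readiness": 5, "regulatory_fit": 6,
                             "technology_maturity": 7, "ecosystem_support": 6},
    "retailer QR payments": {"market_readiness": 8, "regulatory_fit": 7,
                             "technology_maturity": 8, "ecosystem_support": 4},
}

def weighted_score(scores, weights):
    """Aggregate per-criterion scores into a single weighted-sum score."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank scenarios from most to least attractive under the assumed weights.
ranking = sorted(scenarios,
                 key=lambda s: weighted_score(scenarios[s], criteria_weights),
                 reverse=True)
for s in ranking:
    print(f"{s}: {weighted_score(scenarios[s], criteria_weights):.2f}")
```

In practice such scores would be elicited from experts per design iteration and aggregated across actors; the weighted sum is only the simplest member of the MCDM family.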

Relevance: 20.00%

Abstract:

Abstract: Interdisciplinary in nature, my doctoral thesis deals with the literary narratives traditionally called "science fiction", which have received little academic attention so far, and more particularly with the critical dialogue that this literary genre can establish between any reader and the technoscientific society in which he or she lives on a daily basis. To carry out this task rigorously, I had to bring into resonance various fields of knowledge that are not necessarily linked a priori, namely literary studies (the theory of mimesis, reception theory, Schaeffer's theory of fiction and Sartre's limit situations), anthropology (the function of narrative in Jean Molino and Jérôme Brunner), economics (the history and theories of liberalism), philosophy (Heisenberg's and Heidegger's meditations on technology) and epistemology (the links between technology, science and society). Beginning with a historical reflection on the emergence of modern science and economic liberalism, my aim was to show that an original alliance was forged between these two forms of knowledge as early as the end of the eighteenth century: each needs the other, and vice versa. I then show that the birth of science fiction at the end of the nineteenth century corresponds to the need, for certain writers at least, to reflect on the technoscientific evolution of the society of their time and on the consequences that this progress could have for the human being as a moral subject, that is, a problematisation of the relationship between humankind and technoscience through fiction. In this part, I discuss, on the one hand, the fundamental elements essential to establishing an original poetics of science fiction (the conjecture, the structure of the fictional universes and the themes) and, on the other hand, the generic distinctions that need to be drawn; I delimit three: "apologetic", "neutral" and "critical" science fiction. Finally, my work concludes with a reflection, drawing on the thought of Hans Jonas and Jean-Pierre Dupuy, on the active role that science-fiction novels can play at the ethical level. In other words, the end of my study proposes, first, to sketch a "pragmatics of science fiction"; second, my enquiry argues that taking these narratives seriously leads to the possibility of conceiving a particular form of "enlightened catastrophism", which I call the "heuristic of enlightened catastrophism": to ward off the dangers and perils that might threaten us is above all to believe that they could come true if we do not think about them or do not act. This step, which ethicists call for, is precisely the one that seems to me to characterise critical science fiction. In this sense, I show that if science fiction writes "tomorrow", it is in any case not to fantasise about it or to predict it, but, on the contrary, to better think what gives it its shape: "today".

Relevance: 20.00%

Abstract:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications because they cost significantly less than their predecessors, the mainframes. Industrial automation later developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry?

Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law, and these developments produced the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike with CISC processors, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice thanks to hardware-independent, real-time-capable software applications. An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An additional underlying element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, together with the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed, all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. As a slow-clockspeed industry, however, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries.

Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware. They enjoy admirable profitability levels on a very narrow customer base, thanks to strong technology-enabled customer lock-in and the customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately the Internet of Things (IoT) and Weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will build for the industrial automation market to face, in due course, an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to competition among the incumbents, first through research on cost-competitiveness efforts in the captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of global software support for complex systems. Third, we investigate the views of industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes along which industrial automation could advance, taking into account the looming rise of the Internet of Things (IoT) and Weightless networks. Industrial automation is an industry dominated by a handful of global players, each focused on maintaining its own proprietary solutions. The rise of de facto standards such as the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created new markets for personal computers, smartphones and tablets, and will eventually also affect industrial automation through game-changing commoditization and the related changes in control points and business models. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Relevance: 20.00%

Abstract:

Methadone inhibits the cardiac potassium channel hERG and can cause a prolonged QT interval. Methadone is chiral but its therapeutic activity is mainly due to (R)-methadone. Whole-cell patch-clamp experiments using cells expressing hERG showed that (S)-methadone blocked the hERG current 3.5-fold more potently than (R)-methadone (IC50s (half-maximal inhibitory concentrations) at 37 degrees C: 2 and 7 microM). As CYP2B6 slow metabolizer (SM) status results in a reduced ability to metabolize (S)-methadone, electrocardiograms, CYP2B6 genotypes, and (R)- and (S)-methadone plasma concentrations were obtained for 179 patients receiving (R,S)-methadone. The mean heart-rate-corrected QT (QTc) was higher in CYP2B6 SMs (*6/*6 genotype; 439+/-25 ms; n=11) than in extensive metabolizers (non *6/*6; 421+/-25 ms; n=168; P=0.017). CYP2B6 SM status was associated with an increased risk of prolonged QTc (odds ratio=4.5, 95% confidence interval=1.2-17.7; P=0.03). This study reports the first genetic factor implicated in methadone metabolism that may increase the risk of cardiac arrhythmias and sudden death. This risk could be reduced by the administration of (R)-methadone.
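
The reported association (odds ratio = 4.5, 95% CI 1.2-17.7) can be illustrated with a minimal sketch of how an odds ratio and its Wald confidence interval are computed from a 2x2 table of CYP2B6 genotype versus prolonged QTc, together with Bazett's formula as one common way to heart-rate-correct the QT interval. The counts below are hypothetical and not taken from the study, and the study's exact correction formula and statistical model are not stated in the abstract.

```python
import math

def bazett_qtc(qt_ms, rr_s):
    """Heart-rate-corrected QT by Bazett's formula: QTc = QT / sqrt(RR),
    with QT in milliseconds and the RR interval in seconds."""
    return qt_ms / math.sqrt(rr_s)

def odds_ratio_wald(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = slow metabolizers with prolonged QTc, b = slow metabolizers without,
    c = extensive metabolizers with prolonged QTc, d = extensive metabolizers without."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Example QTc: QT of 400 ms at a heart rate of 75 bpm (RR = 0.8 s).
print(round(bazett_qtc(400, 0.8), 1), "ms")

# Hypothetical 2x2 counts, chosen only to show the calculation.
or_, lo, hi = odds_ratio_wald(a=5, b=6, c=30, d=138)
print(f"OR = {or_:.1f}, 95% CI {lo:.1f}-{hi:.1f}")
```

The wide confidence interval reported in the study reflects the small number of *6/*6 carriers (n=11), which the log-odds standard error in the sketch makes explicit.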