938 results for Quasi-analytical algorithms


Relevance:

20.00%

Publisher:

Abstract:

Drug abuse is a widespread problem affecting both teenagers and adults. Nitrous oxide is becoming increasingly popular as an inhalation drug, causing harmful neurological and hematological effects. Several gas chromatography-mass spectrometry (GC-MS) methods for nitrous oxide measurement have been described previously. The main drawbacks of these methods are a lack of sensitivity for forensic applications, including an inability to quantitatively determine the concentration of gas present. The following study provides a validated HS-GC-MS method which incorporates hydrogen sulfide as a suitable internal standard, allowing the quantification of nitrous oxide. Upon analysis, the sample and the internal standard have similar retention times and are eluted quickly from the molecular sieve 5Å PLOT capillary column and the Porabond Q column, therefore providing rapid data collection whilst preserving well-defined peaks. After validation, the method was applied to a real case of N2O intoxication, indicating the concentrations present in a mono-intoxication.
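As a rough illustration of the internal-standard quantification step described above (a minimal sketch, not the authors' procedure; the numbers, the arbitrary concentration units and the linear calibration model are assumptions), the peak-area ratio of N2O to the H2S internal standard can be calibrated against known concentrations and then inverted for unknowns:

import numpy as np

# Hypothetical calibration data: known N2O levels (arbitrary concentration
# units) and the corresponding peak-area ratios analyte / internal standard.
calib_conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
calib_ratio = np.array([0.11, 0.21, 0.52, 1.05, 2.08])

# Ordinary least-squares fit of ratio = slope * concentration + intercept.
slope, intercept = np.polyfit(calib_conc, calib_ratio, 1)

def quantify(peak_area_n2o, peak_area_is):
    """Convert a measured peak-area ratio into an N2O concentration."""
    ratio = peak_area_n2o / peak_area_is
    return (ratio - intercept) / slope

# Example: an unknown case sample.
print(quantify(peak_area_n2o=1.3e6, peak_area_is=1.0e6))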

Relevance:

20.00%

Publisher:

Abstract:

We prove that automorphisms of the infinite binary rooted tree T2 do not yield quasi-isometries of Thompson's group F, except for the map which reverses orientation on the unit interval, a natural outer automorphism of F. This map, together with the identity map, forms a subgroup of Aut(T2) consisting of 2-adic automorphisms, following standard terminology used in the study of branch groups. However, for more general p, we show that the analogous groups of p-adic tree automorphisms do not give rise to quasi-isometries of F(p).

Relevance:

20.00%

Publisher:

Abstract:

Many regional governments in developed countries design programs to improve the competitiveness of local firms. In this paper, we evaluate the effectiveness of public programs whose aim is to enhance the performance of firms located in Catalonia (Spain). We compare the performance of publicly subsidised companies (treated) with that of similar, but unsubsidised, companies (non-treated). We use the Propensity Score Matching (PSM) methodology to construct a control group which, with respect to its observable characteristics, is as similar as possible to the treated group, allowing us to identify untreated firms with the same propensity to receive public subsidies. Once a valid comparison group has been established, we compare the respective performance of each firm. We find that recipient firms, on average, change their business practices, improve their performance, and increase their value added as a direct result of public subsidy programs.
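As an illustration of the PSM step (a generic sketch, not the authors' code; the use of scikit-learn, the simulated data and all variable names are assumptions), propensity scores can be estimated with a logistic regression on observable firm characteristics and each subsidised firm matched to its nearest unsubsidised neighbour on that score:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# X: observable firm characteristics (size, age, sector dummies, ...);
# treated: 1 if the firm received a public subsidy, 0 otherwise;
# outcome: performance measure (e.g. value added). All arrays are illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
treated = rng.integers(0, 2, size=500)
outcome = X[:, 0] + 0.5 * treated + rng.normal(size=500)

# 1. Propensity score: probability of treatment given observables.
pscore = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# 2. Nearest-neighbour matching on the propensity score.
ctrl_idx = np.where(treated == 0)[0]
trt_idx = np.where(treated == 1)[0]
nn = NearestNeighbors(n_neighbors=1).fit(pscore[ctrl_idx].reshape(-1, 1))
_, match = nn.kneighbors(pscore[trt_idx].reshape(-1, 1))
matched_ctrl = ctrl_idx[match.ravel()]

# 3. Average treatment effect on the treated: mean outcome difference
#    between subsidised firms and their matched controls.
att = np.mean(outcome[trt_idx] - outcome[matched_ctrl])
print(f"Estimated ATT: {att:.3f}")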

Relevance:

20.00%

Publisher:

Abstract:

We ask whether MNEs’ experience of institutional quality and political risk within their “home” business environments influences their decisions to enter a given country. We set out an explicit theoretical model that allows for the possibility that firms from South source countries may, by virtue of their experience with poor institutional quality, derive a competitive advantage over firms from North countries with respect to investing in destinations in the South. We show that the experience gained by such MNEs of poorer institutional environments may result in their being more prepared to invest in other countries with correspondingly weak institutions.

Relevance:

20.00%

Publisher:

Abstract:

This paper develops and estimates a model of demand for environmental public goods which allows consumers to learn about their preferences through consumption experiences. We develop a theoretical model of Bayesian updating, perform comparative statics over the model, and show how the theoretical model can be consistently incorporated into a reduced-form econometric model. We then estimate the model using data collected for two environmental goods. We find that the theoretical prediction that additional experience makes consumers more certain about their preferences, in both mean and variance, is supported in each case.
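The comparative-statics result that additional experience tightens preference beliefs can be illustrated with a standard normal-normal Bayesian update (a minimal sketch under assumed functional forms, not the paper's model; all numbers are illustrative):

import numpy as np

def update_beliefs(prior_mean, prior_var, signals, signal_var):
    """Normal-normal conjugate update: each consumption experience is a
    noisy signal about the consumer's true valuation of the good."""
    mean, var = prior_mean, prior_var
    for s in signals:
        precision = 1.0 / var + 1.0 / signal_var
        mean = (mean / var + s / signal_var) / precision
        var = 1.0 / precision
    return mean, var

# Illustrative numbers: the posterior variance falls monotonically with the
# number of experiences, i.e. the consumer becomes more certain.
signals = np.random.default_rng(1).normal(loc=2.0, scale=1.0, size=5)
m, v = 0.0, 4.0
for i, s in enumerate(signals, 1):
    m, v = update_beliefs(m, v, [s], signal_var=1.0)
    print(f"after {i} experiences: posterior mean {m:.2f}, variance {v:.2f}")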

Relevance:

20.00%

Publisher:

Abstract:

This technical report is a document prepared as a deliverable [D4.3 Report of the Interlinkages and forecasting prototype tool] of an EU project – DECOIN Project No. 044428 - FP6-2005-SSP-5A. The text is divided into 4 sections: (1) this short introductory section explains the purpose of the report; (2) the second section provides a general discussion of a systemic problem found in existing quantitative analysis of sustainability. It addresses the epistemological implications of complexity, which entails the need to deal with multiple scales and non-equivalent narratives (multiple dimensions/attributes) to be used to define sustainability issues. There is an unavoidable tension between a “steady-state view” (= the perception of what is going on now – reflecting a PAST --> PRESENT view of the reality) versus an “evolutionary view” (= the unknown transformation that we have to expect in the process of becoming of the observed reality and in the observer – reflecting a PRESENT --> FUTURE view of the reality). The section ends by listing the implications of these points for the choice of integrated packages of sustainability indicators; (3) the third section illustrates the potential of the DECOIN toolkit for the study of sustainability trade-offs and linkages across indicators using quantitative examples taken from case studies of another EU project (SMILE). In particular, this section starts by addressing the existence of internal constraints to sustainability (economic versus social aspects). The narrative chosen for this discussion focuses on the dark side of ageing and immigration for the economic viability of social systems. Then the section continues by exploring external constraints to sustainability (economic development vs the environment). The narrative chosen for this discussion focuses on the dark side of the current strategy of economic development, based on externalization and the “bubbles-disease”; (4) the last section presents a critical appraisal of the quality of energy data found in energy statistics. It starts with a discussion of the general goal of statistical accounting. Then it introduces the concept of multipurpose grammars. The second part uses the experience gained in the activities of the DECOIN project to answer the question: how useful are EUROSTAT energy statistics? The answer starts with an analysis of basic epistemological problems associated with the accounting of energy. This discussion leads to the acknowledgment of an important epistemological problem: the unavoidable bifurcations in the mechanism of accounting needed to generate energy statistics. Using numerical examples, the text deals with the following issues: (i) the pitfalls of the actual system of accounting in energy statistics; (ii) a critical appraisal of the actual system of accounting in BP statistics; (iii) a critical appraisal of the actual system of accounting in Eurostat statistics. The section ends by proposing an innovative method to represent energy statistics which may prove more useful for those wishing to develop sustainability indicators.

Relevance:

20.00%

Publisher:

Abstract:

Matrix effects, which represent an important issue in liquid chromatography coupled to mass spectrometry or tandem mass spectrometry detection, should be closely assessed during method development. In the case of quantitative analysis, the use of a stable isotope-labelled internal standard with physico-chemical properties and ionization behaviour similar to those of the analyte is recommended. In this paper, an example of the choice of a co-eluting deuterated internal standard to compensate for short-term and long-term matrix effects in the case of chiral (R,S)-methadone plasma quantification is reported. The method was fully validated over a concentration range of 5-800 ng/mL for each methadone enantiomer with satisfactory relative bias (-1.0 to 1.0%), repeatability (0.9-4.9%) and intermediate precision (1.4-12.0%). From the results obtained during validation, a control chart process was established over 52 series of routine analysis, using both the intermediate precision standard deviation and the FDA acceptance criteria. The results of routine quality control samples generally fell within the +/-15% variability around the target value, and mainly within the two standard deviation interval, illustrating the long-term stability of the method. The intermediate precision variability estimated in method validation was found to be consistent with the routine use of the method. During this period, 257 trough concentration and 54 peak concentration plasma samples from patients undergoing (R,S)-methadone treatment were successfully analysed for routine therapeutic drug monitoring.
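A minimal sketch of the control-chart logic described above (illustrative only; the nominal value, standard deviation and QC results below are assumptions, not data from the study): each routine QC result is checked against both the FDA +/-15% acceptance window around the nominal concentration and the two-standard-deviation interval derived from the intermediate precision estimated during validation.

# nominal: target QC concentration (ng/mL); sd: intermediate-precision SD
# from validation; qc_results: hypothetical routine QC measurements.
nominal = 200.0
sd = 8.0
qc_results = [195.0, 210.0, 188.0, 232.0, 172.0]

for value in qc_results:
    within_fda = abs(value - nominal) <= 0.15 * nominal   # FDA +/-15% rule
    within_2sd = abs(value - nominal) <= 2 * sd           # control-chart limit
    print(f"{value:6.1f} ng/mL  FDA ok: {within_fda}  2SD ok: {within_2sd}")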

Relevance:

20.00%

Publisher:

Abstract:

Background: Distinguishing between postmortem gas accumulations in the body due to natural decomposition and those due to other phenomena such as gas embolism can be a difficult task using Multi-Detector Computed Tomography (MDCT) alone. The Radiological Alteration Index (RAI) was created with the intention of identifying bodies undergoing the putrefaction process based on the quantity of gas detected within the body. The flaw in this approach is the inability to determine with certainty that putrefaction is the origin of the gas volumes in cases of moderate alteration. The aim of the current study is to identify the percentage compositions of O2, N2 and CO2, and the presence of gases such as H2 and H2S, within the sampled sites in order to resolve this complication. Materials and methods: All cases investigated in our University Center of Legal Medicine undergo a Post-Mortem Computed Tomography (PMCT) scan before external examination or autopsy as a routine investigation. In the obtained images, areas of gas were graded as 0, I, II or III based on the amount of gas present, according to the RAI (1). The criteria for these grades depended on the site of the gas; for example, the thoracic and abdominal cavities were graded as I (1-3 cm gas), II (3-5 cm gas) and III (>5 cm gas). Cases showing gaseous sites of grade II or III were selected for this study. Sampling was performed under CT guidance to target the regions to be punctured. Luer-lock PTFE syringes equipped with a three-way valve and needles were used to sample the gas directly (2). Gaseous samples were then analysed using gas chromatography coupled to a thermal conductivity detector (GC-TCD). The components present in the samples were expressed as a percentage of the overall gas present. Results: Up to now, we have investigated more than 40 cases using our standardized procedure for gas sampling and analysis. O2, N2 and CO2 were present in most samples. The following compositions were found to correlate with gas origins of putrefaction, gas embolism/scuba diving accidents and trauma: putrefaction: O2 = 1-5%, CO2 > 15%, N2 = 10-70%, with variable presence of H2/H2S/CH4; gas embolism/scuba diving accidents: varying percentages of O2 and N2, CO2 > 20%; trauma: small percentage of O2, CO2 < 15%, N2 > 65%. H2 and H2S indicated levels of putrefaction, along with methane, which can also reflect environmental conditions or conditions of body storage/burial. Many cases showing large RAI values (advanced alteration) did yield a radiological diagnosis that was in concordance with the interpretation of the gas composition. However, in certain cases (gas embolism, scuba divers) radiological interpretation was not possible and only chemical gas analysis led to the correct diagnosis, meaning that it provided complementary information to the radiological diagnosis. Conclusion: Investigation of postmortem gases is a useful tool to determine the origin of gas generation, which can aid in the diagnosis of the cause of death. Gas levels can provide information on the stage of putrefaction and help to perform essential medico-legal diagnoses such as vital gas embolism.
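The compositional ranges reported above can be read as a simple screening rule; the sketch below merely encodes those reported thresholds (the function name, the ordering used to resolve overlapping ranges, and the example values are illustrative assumptions, not a validated diagnostic tool):

def suggest_gas_origin(o2, co2, n2, reducing_gases_present=False):
    """Rough screening of a postmortem gas composition (all values in %)
    against the ranges reported in the abstract."""
    if 1 <= o2 <= 5 and co2 > 15 and 10 <= n2 <= 70:
        suffix = " (H2/H2S/CH4 detected)" if reducing_gases_present else ""
        return "putrefaction" + suffix
    if co2 > 20:
        return "gas embolism / scuba diving accident"
    if co2 < 15 and n2 > 65:
        return "trauma"
    return "inconclusive - interpret with radiology and autopsy findings"

# Example call with hypothetical percentages.
print(suggest_gas_origin(o2=3, co2=25, n2=60, reducing_gases_present=True))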

Relevance:

20.00%

Publisher:

Abstract:

Defining an efficient training set is one of the most delicate phases for the success of remote sensing image classification routines. The complexity of the problem, the limited temporal and financial resources, as well as the high intraclass variance, can make an algorithm fail if it is trained with a suboptimal dataset. Active learning aims at building efficient training sets by iteratively improving the model performance through sampling. A user-defined heuristic ranks the unlabeled pixels according to a function of the uncertainty of their class membership, and the user is then asked to provide labels for the most uncertain pixels. This paper reviews and tests the main families of active learning algorithms: committee-based, large-margin, and posterior probability-based. For each family, the most recent advances in the remote sensing community are discussed and some heuristics are detailed and tested. Several challenging remote sensing scenarios are considered, including very high spatial resolution and hyperspectral image classification. Finally, guidelines for choosing a suitable architecture are provided for new and/or inexperienced users.
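As an illustration of a posterior probability-based heuristic of the kind reviewed here (a generic breaking-ties sketch, not the paper's implementation; the classifier choice, the simulated data and the batch size are assumptions), the pixels whose two highest class posteriors are closest are queried for labelling:

import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy stand-in for pixel features: a small labelled set and a large
# unlabelled pool (in practice, spectral bands of remote sensing pixels).
X_labelled = rng.normal(size=(50, 10))
y_labelled = rng.integers(0, 3, size=50)
X_pool = rng.normal(size=(1000, 10))

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_labelled, y_labelled)

# Posterior probabilities for every unlabelled pixel.
proba = model.predict_proba(X_pool)

# Breaking-ties margin: difference between the two highest class posteriors.
sorted_proba = np.sort(proba, axis=1)
margin = sorted_proba[:, -1] - sorted_proba[:, -2]

# Ask the user to label the 10 most uncertain pixels (smallest margin).
query_idx = np.argsort(margin)[:10]
print("Pixels to label next:", query_idx)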

Relevance:

20.00%

Publisher:

Abstract:

"Vegeu el resum a l'inici del document del fitxer adjunt."

Relevance:

20.00%

Publisher:

Abstract:

Forensic scientists have long detected the presence of drugs and their metabolites in biological materials such as urine, blood and other body fluids or tissues. For doping analysis, only urine has so far been collected. In recent years, remarkable advances in sensitive analytical techniques have encouraged the analysis of drugs in unconventional biological samples such as hair, saliva and sweat. These samples are easily collected, although drug levels are often lower than the corresponding levels in urine or blood. This chapter reviews recent studies on the detection of doping agents in hair, saliva and sweat. Sampling, analytical procedures and interpretation of the results are discussed in comparison with those obtained from urine and blood samples.

Relevance:

20.00%

Publisher:

Abstract:

This dissertation focuses on the practice of regulatory governance, through the study of the functioning of formally independent regulatory agencies (IRAs), with special attention to their de facto independence. The research goals are grounded in a "neo-positivist" (or "reconstructed positivist") position (Hawkesworth 1992; Radaelli 2000b; Sabatier 2000). This perspective starts from the ontological assumption that even if subjective perceptions are constitutive elements of political phenomena, a real world exists beyond any social construction and can, however imperfectly, become the object of scientific inquiry. Epistemologically, it follows that hypothetico-deductive theories with explanatory aims can be tested by employing a proper methodology and set of analytical techniques. It is thus possible, to a certain extent, to draw scientific inferences and general conclusions according to a Bayesian conception of knowledge, in order to update prior scientific beliefs in the truth of the related hypotheses (Howson 1998), while acknowledging the fact that the conditions of truth are at least partially subjective and historically determined (Foucault 1988; Kuhn 1970). At the same time, a sceptical position is adopted towards the supposed disjunction between facts and values and the possibility of discovering abstract universal laws in social science. It has been observed that the current version of capitalism corresponds to the golden age of regulation, and that since the 1980s no government activity in OECD countries has grown faster than regulatory functions (Jacobs 1999). In an apparent paradox, the ongoing dynamics of liberalisation, privatisation, decartelisation, internationalisation, and regional integration hardly led to the crumbling of the state, but instead promoted a wave of regulatory growth in the face of new risks and new opportunities (Vogel 1996). Accordingly, a new order of regulatory capitalism is rising, implying a new division of labour between state and society and entailing the expansion and intensification of regulation (Levi-Faur 2005). The previous order, relying on public ownership and public intervention and/or on sectoral self-regulation by private actors, is being replaced by a more formalised, expert-based, open, and independently regulated model of governance. Independent regulatory agencies (IRAs), that is, formally independent administrative agencies with regulatory powers that benefit from public authority delegated by political decision makers, represent the main institutional feature of regulatory governance (Gilardi 2008). IRAs constitute a relatively new technology of regulation in Western Europe, at least for certain domains, but they are increasingly widespread across countries and sectors. For instance, independent regulators have been set up to regulate very diverse issues, such as general competition, banking and finance, telecommunications, civil aviation, railway services, food safety, the pharmaceutical industry, electricity, environmental protection, and personal data privacy. Two attributes of IRAs deserve special mention. On the one hand, they are formally separated from democratic institutions and elected politicians, thus raising normative and empirical concerns about their accountability and legitimacy. On the other hand, IRAs often accumulate executive, (quasi-)legislative, and adjudicatory functions alongside their regulatory competencies, yet hard questions about their role as political actors and about their performance remain unaddressed.