982 results for Second Step


Relevance:

60.00%

Publisher:

Abstract:

The widespread use of digital imaging devices for surveillance (CCTV) and entertainment (e.g., mobile phones, compact cameras) has increased the number of images recorded and the opportunities to consider these images as traces or documentation of criminal activity. The forensic science literature focuses almost exclusively on technical issues and evidence assessment [1]. Earlier steps in the investigation phase have been neglected and must be considered. This article is the first comprehensive description of a methodology for event reconstruction using images. This formal methodology was conceptualised from practical experience and applied to different contexts and case studies to test and refine it. Based on this practical analysis, we propose a systematic approach that includes a preliminary analysis followed by four main steps. These steps form a sequence in which the results of each step build on the previous one; the methodology is not linear, however, but a cyclic, iterative progression towards knowledge about an event. The preliminary analysis is a pre-evaluation phase, wherein the potential relevance of images is assessed. In the first step, images are detected and collected as pertinent trace material; the second step involves organising them and assessing their quality and informative potential. The third step includes reconstruction using clues about space, time and actions. Finally, in the fourth step, the images are evaluated and selected as evidence. These steps are described and illustrated using practical examples. The paper outlines how images elicit information about persons, objects, space, time and actions throughout the investigation process to reconstruct an event step by step. We emphasise the hypothetico-deductive reasoning framework, which demonstrates the contribution of images to generating, refining or eliminating propositions or hypotheses. This methodology provides a sound basis for extending the use of images as evidence and, more generally, as clues in investigation and crime reconstruction processes.

Relevance:

60.00%

Publisher:

Abstract:

The objective of this paper is to analyze why firms in some industries locate in specialized economic environments (localization economies) while those in other industries prefer large city locations (urbanization economies). To this end, we examine the location decisions of new manufacturing firms in Spain at the city level and for narrowly defined industries (three-digit level). First, we estimate firm location models to obtain estimates that reflect the importance of localization and urbanization economies in each industry. In a second step, we regress these estimates on industry characteristics that are related to the potential importance of three agglomeration theories, namely, labor market pooling, input sharing and knowledge spillovers. Localization effects are low and urbanization effects are high in knowledge-intensive industries, suggesting that firms (partly) locate in large cities to reap the benefits of inter-industry knowledge spillovers. We also find that localization effects are high in industries that employ workers whose skills are more industry-specific, suggesting that industries (partly) locate in specialized economic environments to share a common pool of specialized workers.
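
A minimal sketch of the two-step logic is given below: industry-specific location coefficients are estimated first and then regressed on an industry characteristic. Ordinary least squares and synthetic data stand in for the discrete-choice location models and the real industry variables used in the paper, so every name and number here is an illustrative assumption.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_industries, n_cities = 30, 200

# Step 1: for each industry, regress firm births per city on local
# specialisation (localization) and city size (urbanization).
step1_estimates = []
for k in range(n_industries):
    localization = rng.normal(size=n_cities)
    urbanization = rng.normal(size=n_cities)
    births = 0.5 * localization + 0.8 * urbanization + rng.normal(size=n_cities)
    X = sm.add_constant(np.column_stack([localization, urbanization]))
    res = sm.OLS(births, X).fit()
    step1_estimates.append(res.params[1:])   # keep the two agglomeration coefficients

step1_estimates = np.array(step1_estimates)

# Step 2: regress the estimated localization effects on an industry characteristic
# (here a single synthetic "knowledge intensity" indicator).
knowledge_intensity = rng.normal(size=n_industries)
X2 = sm.add_constant(knowledge_intensity)
step2 = sm.OLS(step1_estimates[:, 0], X2).fit()
print(step2.summary())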

Relevance:

60.00%

Publisher:

Abstract:

Investigating macro-geographical genetic structures of animal populations is crucial to reconstruct population histories and to identify significant units for conservation. This approach may also provide information about the intraspecific flexibility of social systems. We investigated the history and current structure of a large number of populations of the communally breeding Bechstein's bat (Myotis bechsteinii). Our aim was to understand which factors shape the species' social system over a large ecological and geographical range. Using sequence data from one coding and one noncoding mitochondrial DNA region, we identified the Balkan Peninsula as the main and probably only glacial refugium of the species in Europe. Sequence data also suggest the presence of a cryptic taxon in the Caucasus and Anatolia. In a second step, we used seven autosomal and two mitochondrial microsatellite loci to compare population structures inside and outside the Balkan glacial refugium. Central European and Balkan populations both were more strongly differentiated for mitochondrial DNA than for nuclear DNA, had higher genetic diversities and lower levels of relatedness at swarming (mating) sites than in maternity (breeding) colonies, and showed more differentiation between colonies than between swarming sites. All of these results suggest that populations are shaped by strong female philopatry, male dispersal, and outbreeding throughout their European range. We conclude that Bechstein's bats have a stable social system that is independent of the postglacial history and location of the populations. Our findings have implications for understanding the benefits of sociality in female Bechstein's bats and for the conservation of this endangered species.
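
To illustrate the kind of marker comparison made in that second step, the sketch below computes a Nei-style G_ST for two hypothetical demes, once for a maternally inherited (mtDNA-like) marker and once for a biparental (nuclear-like) marker; the allele frequencies are invented for illustration and are not data from the study.

def gst(freqs):
    # freqs: one allele-frequency vector per population, for a single locus.
    n_pops = len(freqs)
    n_alleles = len(freqs[0])
    h_s = sum(1.0 - sum(p ** 2 for p in pop) for pop in freqs) / n_pops   # mean within-pop heterozygosity
    mean = [sum(pop[i] for pop in freqs) / n_pops for i in range(n_alleles)]
    h_t = 1.0 - sum(p ** 2 for p in mean)                                 # total heterozygosity
    return (h_t - h_s) / h_t

mtdna_freqs   = [[0.9, 0.1], [0.2, 0.8]]      # strong structure, as expected under female philopatry
nuclear_freqs = [[0.55, 0.45], [0.45, 0.55]]  # weak structure, as expected with male-mediated gene flow
print("mtDNA-like G_ST:  ", round(gst(mtdna_freqs), 3))
print("nuclear-like G_ST:", round(gst(nuclear_freqs), 3))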

Relevance:

60.00%

Publisher:

Abstract:

Methods like Event History Analysis can show the existence of diffusion and part of its nature, but they do not study the process itself. Nowadays, thanks to the increasing performance of computers, processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion mainly inspired by the model developed by Braun and Gilardi (2006). I first develop a theoretical framework of policy diffusion that presents the main internal drivers of policy diffusion - such as the preference for the policy, its effectiveness, institutional constraints, and ideology - and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed by these interdependencies, is a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies the development of an algorithm and its programming. Once the algorithm has been programmed, the agents are left to interact. Consequently, a phenomenon of diffusion, derived from learning, emerges, meaning that the choice made by an agent is conditional on the choices made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence - global divergence and local convergence - and triggers the emergence of political clusters, i.e. the creation of regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning that not only is time needed for a policy to deploy its effects, but it also takes time for a country to find the best-suited policy. To conclude, diffusion is a phenomenon that emerges from complex interactions, and the outcomes of my model are in line with the theoretical expectations and the empirical evidence.

(French abstract, translated:) Event history analysis methods make it possible to show the existence of diffusion phenomena and to describe them, but not to study the process itself. Computer simulations, thanks to the increasing performance of computers, make it possible to study these processes as such. This thesis, based on the theoretical model developed by Braun and Gilardi (2006), presents an agent-based simulation of policy diffusion phenomena. The starting point of this work highlights, at the theoretical level, the main drivers of change internal to a country - the preference for a given policy, its effectiveness, institutional constraints, and ideology - and the main diffusion mechanisms, namely learning, competition, emulation, and coercion. Diffusion, defined by the interdependence of the different actors, is a complex system whose study is made possible by agent-based simulations. At the methodological level, we also present the main concepts underlying these simulations, in particular complexity and emergence. Moreover, the use of computer simulations implies the development of an algorithm and its programming. Once this is done, the agents can interact, the result being the emergence of a diffusion phenomenon, derived from learning, in which an agent's choice depends largely on the choices made by its neighbors. This phenomenon follows a characteristic S-shaped curve, driving the creation of regions that are politically identical internally but divergent from one another at the global level. Finally, the average effectiveness in this simulated world follows a J-shaped curve, meaning that time is needed not only for a policy to show its effects, but also for a country to introduce the most effective policy. In conclusion, diffusion is an emergent phenomenon resulting from complex interactions, and the outcomes of the process as developed in this model match both the theoretical expectations and the empirical evidence.
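
A minimal sketch of this kind of agent-based diffusion model is given below: agents on a grid adopt a policy with a probability that rises with the share of adopting neighbours (learning), which typically produces an S-shaped adoption curve and spatial clusters of adopters. The grid topology, parameters and update rule are illustrative assumptions, not the specification used in the thesis.

import random

SIZE = 20          # 20 x 20 grid of "countries"
STEPS = 100
ADOPT_BASE = 0.02  # small chance of independent adoption
LEARN_WEIGHT = 0.3 # how strongly neighbours' adoption raises the probability

grid = [[0 for _ in range(SIZE)] for _ in range(SIZE)]  # 0 = old policy, 1 = new policy
grid[SIZE // 2][SIZE // 2] = 1                          # seed one early adopter

def neighbours(i, j):
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        ni, nj = (i + di) % SIZE, (j + dj) % SIZE       # torus to avoid edge effects
        yield grid[ni][nj]

adoption_curve = []
for t in range(STEPS):
    new_grid = [row[:] for row in grid]
    for i in range(SIZE):
        for j in range(SIZE):
            if grid[i][j] == 0:
                share = sum(neighbours(i, j)) / 4.0     # fraction of adopting neighbours
                p = ADOPT_BASE + LEARN_WEIGHT * share   # learning: choice conditional on neighbours
                if random.random() < p:
                    new_grid[i][j] = 1
    grid = new_grid
    adoption_curve.append(sum(map(sum, grid)) / SIZE ** 2)

# adoption_curve typically traces an S-shaped path, and adopters form spatial clusters.
print(adoption_curve[::10])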

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND AND OBJECTIVES: The suprascapular nerve (SSN) block is frequently performed for different shoulder pain conditions and for perioperative and postoperative pain control after shoulder surgery. Blind and image-guided techniques have been described, all of which target the nerve within the supraspinous fossa or at the suprascapular notch. This classic target point is not always ideal when ultrasound (US) is used because it is located deep under the muscles, and hence the nerve is not always visible. Blocking the nerve in the supraclavicular region, where it passes underneath the omohyoid muscle, could be an attractive alternative. METHODS: In the first step, 60 volunteers were scanned with US in both the supraclavicular and the classic target areas, and the visibility of the SSN in the two regions was compared. In the second step, 20 needles were placed into or immediately next to the SSN in the supraclavicular region of 10 cadavers. The accuracy of needle placement was determined by injection of dye and subsequent dissection. RESULTS: In the supraclavicular region of volunteers, the nerve was identified in 81% of examinations (95% confidence interval [CI], 74%-88%) and located at a median depth of 8 mm (interquartile range, 6-9 mm). Near the suprascapular notch (supraspinous fossa), the nerve was unambiguously identified in 36% of examinations (95% CI, 28%-44%) (P < 0.001) and located at a median depth of 35 mm (interquartile range, 31-38 mm; P < 0.001). In the cadaver investigation, the rate of correct needle placement with the supraclavicular approach was 95% (95% CI, 86%-100%). CONCLUSIONS: Visualization of the SSN with US is better in the supraclavicular region than in the supraspinous fossa. The anatomic dissections confirmed that our novel supraclavicular SSN block technique is accurate.
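
As a quick sanity check on the reported visibility figures, the sketch below recomputes Wald-type 95% confidence intervals for the two proportions, assuming both shoulders of the 60 volunteers were examined (n = 120 examinations); that denominator, and the choice of a simple Wald interval, are assumptions on my part, so small discrepancies from the published intervals are expected.

from math import sqrt

def wald_ci(successes, n, z=1.96):
    # Simple normal-approximation (Wald) interval for a proportion.
    p = successes / n
    half = z * sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# Assumed counts: roughly 81% and 36% of 120 examinations (hypothetical denominators).
for label, k, n in [("supraclavicular region", 97, 120),
                    ("supraspinous fossa", 43, 120)]:
    p, lo, hi = wald_ci(k, n)
    print(f"{label}: {p:.0%} (95% CI {lo:.0%}-{hi:.0%})")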

Relevance:

60.00%

Publisher:

Abstract:

In recent years, organisations have become increasingly aware of the importance of managing their information. Data warehouses arise from this need to organise one's own data in a coherent and efficient way for analysis. The project "Construcció i explotació d'un magatzem de dades per a l'anàlisi estadístic dels resultats del Campionat de Fórmula 1" (construction and exploitation of a data warehouse for the statistical analysis of Formula 1 Championship results) falls within this context; its objective is to design a data warehouse covering the needs of the Institut Català d'Esports de Motor (ICEM) in managing its Formula 1 data. The project carried out the different phases required to create a new warehouse. First, the requirements and the available data were analysed. Next, the data warehouse was designed and physically implemented. This was followed by the extraction, transformation and loading (ETL) of the original data, and finally the reports were created. The result is a set of reports ready to be consumed immediately by ICEM users; these reports should allow the data to be analysed in a simple and agile way. In addition, an automatic data-update system was implemented to maintain the information in the warehouse. The data warehouse was implemented on an Oracle 10g Express Edition database and the reports were designed with the Oracle Discoverer tool. Visual Basic Script and the SQL*Loader loading tool were also used to automate the data loading.
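
To make the warehouse idea concrete, here is an illustrative sketch of a small star schema and a typical report query, written with Python's sqlite3 for self-containment; the actual project used Oracle 10g Express Edition, Oracle Discoverer and SQL*Loader, and the table names, columns and sample rows below are my own assumptions rather than the project's real model.

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_driver (driver_id INTEGER PRIMARY KEY, name TEXT, team TEXT);
CREATE TABLE dim_race   (race_id   INTEGER PRIMARY KEY, season INTEGER, circuit TEXT);
CREATE TABLE fact_result (
    race_id   INTEGER REFERENCES dim_race(race_id),
    driver_id INTEGER REFERENCES dim_driver(driver_id),
    position  INTEGER,
    points    REAL
);
""")

con.executemany("INSERT INTO dim_driver VALUES (?, ?, ?)",
                [(1, "Driver A", "Team X"), (2, "Driver B", "Team Y")])
con.executemany("INSERT INTO dim_race VALUES (?, ?, ?)",
                [(1, 2007, "Montmelo"), (2, 2007, "Monza")])
con.executemany("INSERT INTO fact_result VALUES (?, ?, ?, ?)",
                [(1, 1, 1, 10), (1, 2, 2, 8), (2, 1, 3, 6), (2, 2, 1, 10)])

# A typical report query: championship points per driver for a season.
for row in con.execute("""
    SELECT d.name, SUM(f.points) AS total_points
    FROM fact_result f
    JOIN dim_driver d ON d.driver_id = f.driver_id
    JOIN dim_race r   ON r.race_id = f.race_id
    WHERE r.season = 2007
    GROUP BY d.name ORDER BY total_points DESC"""):
    print(row)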

Relevance:

60.00%

Publisher:

Abstract:

This paper provides a new and accessible approach to establishing certain results concerning the discounted penalty function. The direct approach consists of two steps. In the first step, closed-form expressions are obtained in the special case in which the claim amount distribution is a combination of exponential distributions. A rational function is useful in this context. For the second step, one observes that the family of combinations of exponential distributions is dense. Hence, it suffices to reformulate the results of the first step to obtain general results. The surplus process has downward and upward jumps, modeled by two independent compound Poisson processes. If the distribution of the upward jumps is exponential, a series of new results can be obtained with ease. Subsequently, certain results of Gerber and Shiu [H. U. Gerber and E. S. W. Shiu, North American Actuarial Journal 2(1): 48–78 (1998)] can be reproduced. The two-step approach is also applied when an independent Wiener process is added to the surplus process. Certain results are related to Zhang et al. [Z. Zhang, H. Yang, and S. Li, Journal of Computational and Applied Mathematics 233: 1773–1784 (2010)], which uses different methods.
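
For reference, a standard formulation of the discounted penalty function and of a combination of exponential distributions, consistent with Gerber and Shiu (1998) though not quoted from this paper, is:

\[
m(u) \;=\; \mathbb{E}\!\left[ e^{-\delta T}\, w\bigl(U(T^-),\, |U(T)|\bigr)\, \mathbf{1}_{\{T<\infty\}} \,\Big|\, U(0)=u \right],
\]

where \(U(t)\) is the surplus process, \(T\) is the time of ruin, \(\delta \ge 0\) is the force of interest used for discounting, and \(w\) is a penalty applied to the surplus immediately before ruin and the deficit at ruin. A combination of exponentials has density

\[
f(x) \;=\; \sum_{i=1}^{n} A_i\, \beta_i e^{-\beta_i x}, \qquad x>0, \qquad \sum_{i=1}^{n} A_i = 1,
\]

where the weights \(A_i\) may be negative; the denseness of this family among distributions on the positive half-line is what justifies passing from the first step to the general results in the second step.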

Relevance:

60.00%

Publisher:

Abstract:

AIM: To assess whether repeating a grade was associated with drug use among adolescents after controlling for personal, family and school-related variables, and whether there were differences between students in mandatory and post-mandatory school. METHODS: Data were drawn from the Catalonia Adolescent Health Survey, a cross-sectional study of in-school adolescents aged 14-19 y. The index group included 366 subjects who were repeating a grade at the time the survey was carried out (old-for-grade, OFG). A control group, matched by gender and school and enrolled one grade ahead, was randomly chosen from among all subjects who had never repeated a grade. All statistically significant variables in the bivariate analysis were included in a multivariate analysis. In a second step, all analyses were repeated for students in mandatory (14-16 y) and post-mandatory (17-19 y) school. RESULTS: After controlling for background variables, subjects in the index group were more likely to perceive that most of their peers were using synthetic drugs and to have ever used them, to have bad grades and to have a worse relationship with their teachers. OFG students in mandatory school were more likely to have divorced parents, bad grades and to have ever used synthetic drugs, whereas they were less likely to be regular drinkers. OFG students in post-mandatory school were more likely to have below-average grades, to be regular smokers and to perceive that most of their peers used synthetic drugs. CONCLUSIONS: When background variables are taken into consideration, the relationship between repeating a grade and drug use is not so clear. Increasing the familial and academic support of adolescents with academic underachievement could reduce their drug consumption.

Relevance:

60.00%

Publisher:

Abstract:

This paper proposes a field application of a high-level reinforcement learning (RL) control system for solving the action selection problem of an autonomous robot in a cable-tracking task. The learning system is characterized by using a direct policy search method for learning the internal state/action mapping. Policy-only algorithms may suffer from long convergence times when dealing with real robotics. In order to speed up the process, the learning phase has been carried out in a simulated environment and, in a second step, the policy has been transferred and tested successfully on a real robot. Future work will continue the learning process online on the real robot while it performs the task. We demonstrate the feasibility of the approach with real experiments on the underwater robot ICTINEU AUV.
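
A minimal sketch of direct policy search in the REINFORCE style is given below, on a toy one-dimensional "keep the offset to the cable near zero" task; the dynamics, reward, features and learning rate are illustrative assumptions and not the controller actually used on ICTINEU AUV. In the spirit of the paper, the weights learned in such a simulation would then be transferred to the real robot.

import numpy as np

rng = np.random.default_rng(0)
theta = np.zeros(2)      # policy weights over [offset, bias] features
alpha = 0.05             # learning rate
baseline = 0.0           # running-average return, used as a variance-reducing baseline

def run_episode(theta, steps=50):
    # The agent tries to keep its lateral offset to the cable near zero.
    offset, grads, total = 1.0, [], 0.0
    for _ in range(steps):
        feats = np.array([offset, 1.0])
        p_left = 1.0 / (1.0 + np.exp(-feats @ theta))     # stochastic logistic policy
        go_left = rng.random() < p_left
        action = 1 if go_left else -1                     # steer left (+1) or right (-1)
        grads.append(((1.0 if go_left else 0.0) - p_left) * feats)  # grad of log-probability
        offset += -0.2 * action + rng.normal(scale=0.05)  # assumed environment dynamics
        total += -abs(offset)                             # reward: stay close to the cable
    return grads, total

for episode in range(300):
    grads, ret = run_episode(theta)
    baseline += 0.05 * (ret - baseline)
    advantage = ret - baseline
    for g in grads:
        theta += alpha * advantage * g / len(grads)       # REINFORCE update with baseline

print(theta)   # the first weight typically becomes positive: steer left when offset > 0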

Relevance:

60.00%

Publisher:

Abstract:

Multiple endocrine neoplasia type 2A (MEN2A) is a monogenic disorder with an autosomal dominant pattern of inheritance, characterized by a high risk of medullary thyroid carcinoma in all mutation carriers. Although this disorder is classified as a rare disease, affected patients have a reduced quality of life and require continuous and very expensive treatment. At present, MEN2A is diagnosed by gene sequencing after birth, with the aim of starting treatment early and reducing morbidity and mortality. We first evaluated the presence of the MEN2A mutation (C634Y) in the serum of 25 patients, previously diagnosed by sequencing of peripheral blood leucocytes, using HRM genotyping analysis. In a second step, we used a COLD-PCR approach followed by HRM genotyping analysis for the non-invasive prenatal diagnosis of a pregnant woman carrying a fetus with a C634Y mutation. HRM analysis revealed differences in melting curve shapes that matched the gene-sequencing diagnoses of MEN2A with 100% accuracy. Moreover, the pregnant woman carrying the fetus with the C634Y mutation showed a melting curve shape in agreement with the positive controls in the COLD-PCR study. The mutation was confirmed by sequencing of the COLD-PCR amplification product. In conclusion, we have established HRM analysis of serum samples as a new primary diagnostic method suitable for the detection of C634Y mutations in MEN2A patients. Simultaneously, we have exploited the increased sensitivity of the COLD-PCR approach combined with HRM analysis for the non-invasive prenatal diagnosis of C634Y fetal mutations using serum from pregnant women.
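
For readers unfamiliar with HRM genotyping, the sketch below shows the gist of a melting-curve comparison: curves are normalised and subtracted from a wild-type reference, and samples whose difference plot exceeds a threshold are flagged. The synthetic curves, normalisation and threshold are illustrative assumptions, not the algorithm of any particular HRM instrument.

import numpy as np

temps = np.linspace(70.0, 90.0, 200)

def melt_curve(tm, width=1.0):
    # Synthetic fluorescence: high below the melting temperature Tm, low above it.
    return 1.0 / (1.0 + np.exp((temps - tm) / width))

def normalise(curve):
    return (curve - curve.min()) / (curve.max() - curve.min())

reference = normalise(melt_curve(tm=80.0))                 # wild-type-like reference curve
samples = {
    "wild-type-like": normalise(melt_curve(tm=80.05)),
    "heteroduplex-like": normalise(0.5 * melt_curve(80.0) + 0.5 * melt_curve(78.5)),
}

THRESHOLD = 0.05
for name, curve in samples.items():
    diff = curve - reference                               # the "difference plot"
    flagged = np.max(np.abs(diff)) > THRESHOLD
    print(name, "-> variant suspected" if flagged else "-> matches reference")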

Relevance:

60.00%

Publisher:

Abstract:

A transportable Raman spectrometer was tested for the detection of illicit drugs seized during border controls. In a first step, the analysis methodology was optimized using reference substances such as diacetylmorphine (heroin), cocaine and amphetamine (in powder or liquid form). The appropriate focal distance and analysis times, the influence of daylight and artificial light sources, repeatability and limits of detection were studied. In a second step, the applications and limitations of the technique for detecting illicit substances in different mixtures and containers were evaluated. Transportable Raman spectroscopy was found to be suitable for the rapid screening of liquids and powders for the detection and identification of controlled substances. Additionally, it had the advantage over other portable techniques, such as ion mobility spectrometry, of being non-destructive and capable of rapid analysis of large quantities of substances through containers such as plastic bags and glass bottles.
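
Identification against a reference library is usually done by scoring the similarity between spectra; the sketch below uses a simple correlation score on synthetic spectra. The peak positions, library entries and match threshold are invented for illustration and are not real drug spectra or the instrument's actual matching algorithm.

import numpy as np

shifts = np.linspace(200, 1800, 800)           # Raman shift axis (cm^-1)

def peaks(centres):
    # Synthetic spectrum: a sum of Gaussian bands at the given centres.
    return sum(np.exp(-0.5 * ((shifts - c) / 8.0) ** 2) for c in centres)

library = {
    "reference A": peaks([480, 870, 1270, 1600]),
    "reference B": peaks([520, 1000, 1450]),
}

unknown = peaks([480, 870, 1270, 1600]) + 0.1 * np.random.default_rng(0).normal(size=shifts.size)

def correlation(a, b):
    a, b = a - a.mean(), b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = {name: correlation(unknown, ref) for name, ref in library.items()}
best = max(scores, key=scores.get)
print(scores, "-> best match:", best if scores[best] > 0.9 else "no confident match")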

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: International comparisons of social inequalities in alcohol use have not been extensively investigated. The purpose of this study was to examine the relationship of country-level characteristics and individual socio-economic status (SES) with individual alcohol consumption in 33 countries. METHODS: Data on 101,525 men and women collected by cross-sectional surveys in 33 countries of the GENACIS study were used. Individual SES was measured by highest attained educational level. Alcohol use measures included drinking status and monthly risky single-occasion drinking (RSOD). The relationship between individuals' education and drinking indicators was examined by meta-analysis. In a second step, the individual-level and country-level data were combined and tested in multilevel models. As country-level indicators we used the purchasing power parity of gross national income, the Gini coefficient and the Gender Gap Index. RESULTS: For both genders and in all countries, higher individual SES was positively associated with drinking status. Higher country-level SES was also associated with higher proportions of drinkers. Lower SES was associated with RSOD among men. Women of higher SES in low-income countries were more often RSO drinkers than women of lower SES; the opposite was true in higher-income countries. CONCLUSION: For the most part, findings regarding SES and drinking in higher-income countries were as expected. However, women of higher SES in low- and middle-income countries appear to be at higher risk of engaging in RSOD. This finding should be kept in mind when developing new policy and prevention initiatives.
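
The two-level structure described (individuals nested within countries) can be sketched as follows, using a linear mixed model from statsmodels as a stand-in for the multilevel models in the paper; the variable names, the synthetic data and the use of a linear rather than a logistic specification are all simplifying assumptions on my part.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for c in [f"c{i}" for i in range(33)]:
    gini = rng.uniform(25, 55)                  # country-level inequality indicator
    country_effect = rng.normal(scale=0.1)      # unobserved country heterogeneity
    for _ in range(300):
        education = int(rng.integers(0, 4))     # individual-level SES proxy
        # latent propensity for risky single-occasion drinking (RSOD)
        y = 0.2 - 0.03 * education + 0.004 * gini + country_effect + rng.normal(scale=0.3)
        rows.append({"country": c, "education": education, "gini": gini, "rsod": y})
df = pd.DataFrame(rows)

# Random intercept per country; fixed effects for individual education and country-level Gini.
model = smf.mixedlm("rsod ~ education + gini", data=df, groups="country").fit()
print(model.summary())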

Relevance:

60.00%

Publisher:

Abstract:

Sponsored by the health administrations of nine cantons, this study was conducted by the University Institute of Social and Preventive Medicine in Lausanne in order to assess how DRGs could be used within the Swiss context. A database drawn mainly from the Swiss VESKA statistics was used. The first step involved transforming Swiss diagnostic and intervention codes into US codes, allowing direct use of the Yale DRG Grouper. The second step showed that the overall performance of DRGs in terms of reducing the variability of length of stay was similar to that observed in the US; there are, however, problems with the homogeneity of medico-technical procedures within DRGs. The third step showed how DRGs could be used as an accounting unit in hospitals and how costs per DRG could be estimated. Other applications of DRGs were examined, for example comparisons of case mix or length of stay between hospitals.
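
The "variability reduction" performance measure mentioned in the second step is essentially the share of length-of-stay variance explained by the DRG grouping, an R-squared-type statistic; the sketch below computes it on synthetic stays. The DRG codes, mean stays and sample sizes are invented for illustration only.

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
drg_means = {"DRG_014": 9.0, "DRG_089": 7.0, "DRG_127": 6.0, "DRG_209": 11.0}
records = []
for drg, mean_los in drg_means.items():
    for _ in range(250):
        records.append({"drg": drg, "los": max(1.0, rng.normal(mean_los, 2.5))})
stays = pd.DataFrame(records)

total_var = stays["los"].var(ddof=0)
# Groups are equal-sized here, so a simple mean of within-group variances is adequate.
within_var = stays.groupby("drg")["los"].var(ddof=0).mean()
variance_reduction = 1.0 - within_var / total_var
print(f"Share of LOS variance explained by the DRG grouping: {variance_reduction:.1%}")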

Relevance:

60.00%

Publisher:

Abstract:

According to the World Health Organization, 5.1% of blindness or visual impairment is related to corneal opacification. The cornea is the transparent tissue placed in front of the coloured part of the eye (the iris), and its transparency is mandatory for vision. The ocular surface is a functional unit including the cornea and all the elements involved in maintaining its transparency, i.e., the eyelids, the conjunctiva, the lymphoid tissue of the conjunctiva, the limbus, the lacrimal glands and the tear film. Destruction of the ocular surface is a disease caused by trauma, infection, chronic inflammation, cancer, toxic agents, congenital abnormalities or unknown causes. The treatment of ocular surface destruction requires a global strategy that includes all the elements involved in its physiology. The microenvironment of the ocular surface must first be restored, i.e., the lids, the conjunctiva, the limbus and the structures that secrete the different layers of the tear film. In a second step, the transparency of the cornea can be reconstructed. A corneal graft performed in a healthy ocular surface microenvironment will have a better survival rate. To achieve these goals, a thorough understanding of the renewal of the epithelia and of the role of epithelial stem cells is mandatory.

Relevance:

60.00%

Publisher:

Abstract:

Our essay aims at studying suitable statistical methods for the clustering of compositional data in situations where observations are constituted by trajectories of compositional data, that is, by sequences of composition measurements along a domain. Observed trajectories are known as "functional data" and several methods have been proposed for their analysis. In particular, methods for clustering functional data, known as Functional Cluster Analysis (FCA), have been applied by practitioners and scientists in many fields. To our knowledge, FCA techniques have not been extended to cope with the problem of clustering compositional data trajectories. In order to extend FCA techniques to the analysis of compositional data, FCA clustering techniques have to be adapted by using a suitable compositional algebra. The present work centres on the following question: given a sample of compositional data trajectories, how can we formulate a segmentation procedure giving homogeneous classes? To address this problem we follow the steps described below. First of all, we adapt the well-known spline smoothing techniques in order to cope with the smoothing of compositional data trajectories. In fact, an observed curve can be thought of as the sum of a smooth part plus some noise due to measurement errors. Spline smoothing techniques are used to isolate the smooth part of the trajectory; clustering algorithms are then applied to these smooth curves. The second step consists in building suitable metrics for measuring the dissimilarity between trajectories: we propose a metric that accounts for differences in both shape and level, and a metric accounting for differences in shape only. A simulation study is performed in order to evaluate the proposed methodologies, using both hierarchical and partitional clustering algorithms. The quality of the obtained results is assessed by means of several indices.
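
A minimal end-to-end sketch of the pipeline described is given below, assuming a simple log-ratio transform as the compositional algebra, spline smoothing of each coordinate, a Euclidean distance between smoothed curves (capturing both shape and level), and hierarchical clustering; the synthetic trajectories and the particular transform are illustrative simplifications rather than the procedures actually proposed in the text.

import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
domain = np.linspace(0.0, 1.0, 40)

def make_trajectory(shift):
    # 3-part compositions along the domain (each row sums to 1), plus noise.
    a = 1.0 + shift + 0.5 * domain + 0.05 * rng.normal(size=domain.size)
    b = 1.0 + 0.3 * np.sin(2 * np.pi * domain) + 0.05 * rng.normal(size=domain.size)
    c = np.ones_like(domain)
    raw = np.exp(np.column_stack([a, b, c]))
    return raw / raw.sum(axis=1, keepdims=True)

trajectories = [make_trajectory(shift) for shift in [0.0, 0.1, 1.0, 1.1]]

def to_logratio(comp):
    # Two log-ratio coordinates (a simple stand-in for an isometric log-ratio basis).
    return np.column_stack([np.log(comp[:, 0] / comp[:, 2]),
                            np.log(comp[:, 1] / comp[:, 2])])

def smooth(coords):
    # Spline-smooth each coordinate to isolate the smooth part of the trajectory.
    return np.column_stack([UnivariateSpline(domain, coords[:, j], s=0.5)(domain)
                            for j in range(coords.shape[1])])

features = np.array([smooth(to_logratio(t)).ravel() for t in trajectories])
# Centring each smoothed curve before computing distances would give a shape-only comparison.
clusters = fcluster(linkage(pdist(features), method="ward"), t=2, criterion="maxclust")
print(clusters)   # trajectories 1-2 and 3-4 should fall into separate clusters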