893 results for Feature Extraction Algorithms
Abstract:
The small Rho-family GTPase Cdc42 is critical for cell polarization and polarizes spontaneously in the absence of upstream spatial cues. Spontaneous polarization is thought to require dynamic Cdc42 recycling through Guanine nucleotide Dissociation Inhibitor (GDI)-mediated membrane extraction and vesicle trafficking. Here, we describe a functional fluorescent Cdc42 allele in fission yeast, which demonstrates Cdc42 dynamics and polarization independent of these pathways. Furthermore, an engineered Cdc42 allele targeted to the membrane independently of these recycling pathways by an amphipathic helix is viable and polarizes spontaneously to multiple sites in fission and budding yeasts. We show that Cdc42 is highly mobile at the membrane and accumulates at sites of activity, where it displays slower mobility. By contrast, a near-immobile transmembrane domain-containing Cdc42 allele supports viability and polarized activity, but does not accumulate at sites of activity. We propose that Cdc42 activation, enhanced by positive feedback, leads to its local accumulation by capture of fast-diffusing inactive molecules.
Abstract:
This thesis is composed of three main parts. The first consists of a state of the art of the notions significant to understanding art authentication in general, and signatures in particular, which the author deemed necessary to fully grasp the microcosm that makes up this particular market. Readers with a solid knowledge of the art and expertise field who are particularly interested in the present study are advised to advance directly to Chapter 4. The expertise of the signature, its reliability, and the factors impacting the expert's conclusions are brought forward. The final aim of the state of the art is to offer a general list of recommendations based on an exhaustive review of the current literature and given in light of all of the exposed issues. These guidelines are specifically formulated for the expertise of signatures on paintings, but can also be applied to wider themes in the area of signature examination. The second part of this thesis covers the experimental stages of the research. It consists of the method developed to authenticate painted signatures on works of art. This method is articulated around several main objectives: defining measurable features on painted signatures and establishing their relevance for separating groups of authentic and simulated signatures. For the first time, numerical analyses of painted signatures have been obtained and used to attribute their authorship to given artists. An in-depth discussion of the developed method constitutes the third and final part of this study. It evaluates the opportunities and constraints of its application by signature and handwriting experts in forensic science. The outlines presented below summarize the aims and main themes addressed in each chapter.

Part I - Theory

Chapter 1 exposes the legal aspects surrounding the authentication of works of art by art experts. The definition of what is legally authentic, the quality and types of experts that can express an opinion concerning the authorship of a specific painting, and standard deontological rules are addressed. The practices applied in Switzerland are specifically dealt with.

Chapter 2 presents an overview of the different scientific analyses that can be carried out on paintings (from the canvas to the top coat). Scientific examinations of works of art have become more common as more and more museums equip themselves with laboratories, so an understanding of their role in the art authentication process is vital. The added value that a signature expertise can have in comparison to other scientific techniques is also addressed.

Chapter 3 provides a historical overview of the signature on paintings throughout the ages, in order to offer the reader an understanding of the origin of the signature on works of art and its evolution through time. An explanation is given of the transitions that the signature went through from the 15th century on and how it progressively took on its widely known modern form. Both this chapter and Chapter 2 are presented to show the reader the rich sources of information that can be used to describe a painting, and how the signature is one of these sources.
Chapter 4 focuses on the different hypotheses the forensic handwriting examiner (FHE) must keep in mind when examining a painted signature, since a number of scenarios can be encountered when dealing with signatures on works of art. The different forms of signatures, as well as the variables that may influence painted signatures, are also presented. Finally, the current state of knowledge of the examination procedure of signatures in forensic science in general, and of painted signatures in particular, is exposed. The state of the art of the assessment of the authorship of signatures on paintings is established and discussed in light of the theoretical facets mentioned previously.

Chapter 5 considers key elements that can have an impact on the FHE during his or her examinations. This includes a discussion of elements such as the skill, confidence and competence of an expert, as well as the potential bias effects he or she might encounter. A better understanding of the elements surrounding handwriting examinations, in order to better communicate results and conclusions to an audience, is also undertaken.

Chapter 6 reviews the judicial acceptance of signature analysis in courts and closes the state of the art section of this thesis. This chapter brings forward the current issues pertaining to the appreciation of this expertise by the non-forensic community, and discusses the increasing number of claims of the unscientific nature of signature authentication. The necessity of aiming for more scientific, comprehensive and transparent authentication methods is discussed. The theoretical part of this thesis concludes with a series of general recommendations for forensic handwriting examiners, specifically for the expertise of signatures on paintings. These recommendations stem from the exhaustive review of the literature and the issues it exposed, and can also be applied to the traditional examination of signatures (on paper).

Part II - Experimental part

Chapter 7 describes and defines the sampling, extraction and analysis phases of the research. The sampling of artists' signatures and their respective simulations is presented, followed by the steps undertaken to extract and determine sets of characteristics, specific to each artist, that describe their signatures. The method is based on a study of five artists and a group of individuals acting as forgers for the sake of this study. Finally, the procedure for analyzing these characteristics to assess the strength of evidence, based on a Bayesian reasoning process, is presented.

Chapter 8 outlines the results concerning both the artist and simulation corpuses after their optical observation, followed by the results of the analysis phase of the research. The feature selection process and the likelihood ratio evaluation are the main themes addressed. The discrimination power between the two corpuses is illustrated through multivariate analysis.

Part III - Discussion

Chapter 9 discusses the materials, the methods, and the obtained results of the research. The opportunities, but also the constraints and limits, of the developed method are exposed. Future work that can be carried out subsequent to the results of the study is also presented.

Chapter 10, the last chapter of this thesis, proposes a strategy to incorporate the model developed in the preceding chapters into traditional signature expertise procedures.
Thus, the strength of this expertise is discussed in conjunction with the traditional conclusions reached by forensic handwriting examiners. Finally, this chapter summarizes and advocates a list of formal recommendations for good practice for handwriting examiners. In conclusion, the research highlights the interdisciplinary nature of the examination of signatures on paintings. The current state of knowledge of the judicial quality of art experts, along with the scientific and historical analysis of paintings and signatures, is overviewed to give the reader a feel for the different factors that have an impact on this particular subject. The hesitant acceptance of forensic signature analysis in court, also presented in the state of the art, explicitly demonstrates the necessity of better recognition of signature expertise by courts of law. This general acceptance, however, can only be achieved by producing high-quality results through a well-defined examination process. This research offers an original approach to attributing a painted signature to a certain artist: for the first time, a probabilistic model used to measure the discriminative potential between authentic and simulated painted signatures is studied. The opportunities and limits of this method of scientifically establishing the authorship of signatures on works of art are thus presented. In addition, as its second key contribution, this work proposes a procedure to combine the developed method with that traditionally used by signature experts in forensic science. Such an implementation into holistic traditional signature examination casework is a large step towards providing the forensic, judicial and art communities with a solidly based reasoning framework for the examination of signatures on paintings. The framework and preliminary results associated with this research have been published (Montani, 2009a) and presented at international forensic science conferences (Montani, 2009b; Montani, 2012).
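To make the Bayesian likelihood-ratio evaluation of Chapters 7-8 concrete, the following is a minimal sketch, assuming Gaussian models of a few measured signature features; the feature values, means and covariances are hypothetical, not the thesis's actual corpuses or fitted model.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical Gaussian models of three measured signature features
# (e.g. stroke length, slant, relative letter height); the thesis fits
# its models to corpuses of authentic and simulated signatures.
authentic_mean = np.array([12.0, 0.35, 1.8])
authentic_cov = np.diag([1.0, 0.01, 0.05])
simulated_mean = np.array([10.5, 0.50, 1.5])
simulated_cov = np.diag([2.0, 0.04, 0.10])

def likelihood_ratio(features):
    """LR = P(features | authentic) / P(features | simulated)."""
    p_auth = multivariate_normal.pdf(features, authentic_mean, authentic_cov)
    p_sim = multivariate_normal.pdf(features, simulated_mean, simulated_cov)
    return p_auth / p_sim

questioned = np.array([11.7, 0.38, 1.75])
print(f"LR = {likelihood_ratio(questioned):.2f}")  # LR > 1 supports authorship; LR < 1 supports simulation
```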
Abstract:
The goal of this work was to model the additional costs caused by a new product feature and to design a decision-making tool for the management team of Timberjack Oy's forwarder manufacturing. The aim was to create a coarse-level model suitable for determining the costs of different types of product features. The effects of a new product feature on the company's various functions were investigated through interviews, supported by a questionnaire. The interviews aimed to identify the processes, activities and resources necessary for bringing a new product feature into production and for producing it. The model was designed on the basis of the interviews and data obtained from the information system. Its framework was formed by the processes and activities that a new product feature affects. Resources consumed by the new product feature, either directly or indirectly, were taken into account. Only additional costs were included in the analysis; overhead costs that would be incurred regardless of whether the new product feature is implemented were excluded. The model is a generalization of the additional costs caused by a new product feature, as it is intended to be applicable to different types of product features. In addition, the model is suitable for estimating the costs of other minor product changes.
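To illustrate the shape of such an incremental cost model, the sketch below aggregates only the direct and indirect costs of the activities a new feature touches, leaving baseline overhead out; all activity names and figures are invented, not Timberjack data.

```python
# Hypothetical activities affected by a new product feature; each tuple is
# (activity, direct cost per unit, indirect cost per unit). Overhead that
# would be incurred anyway is deliberately left out, as in the model above.
affected_activities = [
    ("assembly",            40.0, 10.0),
    ("quality control",      8.0,  2.0),
    ("spare part logistics", 5.0,  1.5),
]

def additional_cost_per_unit(activities):
    """Sum only the incremental (direct + indirect) costs per unit produced."""
    return sum(direct + indirect for _, direct, indirect in activities)

print(f"Additional cost per machine: {additional_cost_per_unit(affected_activities):.2f}")
```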
Abstract:
In this paper we design and develop several filtering strategies for the analysis of data generated by a resonant bar gravitational wave (GW) antenna, with the goal of assessing the presence (or absence) therein of long-duration monochromatic GW signals, as well as the amplitude and frequency of any such signals, within the sensitivity band of the detector. Such signals are most likely generated by the fast rotation of slightly asymmetric spinning stars. We develop practical procedures, together with a study of their statistical properties, which provide useful information on the performance of each technique. The selection of candidate events is then established according to threshold-crossing probabilities, based on the Neyman-Pearson criterion. In particular, we show that our approach, based on phase estimation, presents a better signal-to-noise ratio than pure spectral analysis, the most common approach.
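As an illustration of the Neyman-Pearson selection step, the sketch below fixes a detection threshold from a desired false-alarm probability under an assumed Gaussian noise-only model; the paper's actual detection statistics and their distributions differ per filtering strategy.

```python
import numpy as np
from scipy.stats import norm

# Assumed noise-only model: the detection statistic is N(0, sigma^2).
sigma = 1.0
p_false_alarm = 1e-3

# Neyman-Pearson: fix the threshold so that
# P(statistic > threshold | noise only) = p_false_alarm.
threshold = norm.ppf(1.0 - p_false_alarm, loc=0.0, scale=sigma)

# Keep only the events whose statistic crosses the threshold.
statistics = np.array([0.4, 2.1, 3.5, 1.0])
candidates = statistics[statistics > threshold]
print(f"threshold = {threshold:.3f}, candidates = {candidates}")
```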
Abstract:
Introduction: Third molar extraction is the most frequent procedure in oral surgery. The present study evaluates the indication of third molar extraction as established by the primary care dentist (PCD) and the oral surgeon, and compares the justification for extraction with the principal reason for patient consultation. Patients and method: A descriptive study was made of 319 patients subjected to surgical removal of a third molar in the context of the Master of Oral Surgery and Implantology (Barcelona University Dental School, Barcelona, Spain) between July 2004 and March 2005. The following parameters were evaluated: sex, age, molar, type of impaction, position according to the classifications of Pell and Gregory and of Winter, and the reasons justifying extraction. Results: The lower third molars were the most commonly extracted (73.7%). A total of 69.6% of the teeth were covered by soft tissues only. Fifty-six percent of the lower molars corresponded to Pell and Gregory Class IIB, while 42.1% were in the vertical position. The most common reason for patient referral to our Service of Oral Surgery by the PCD was prophylactic removal (51.0%, versus 46.1% in the case of the oral surgeon). Discussion and conclusions: Our results show prophylaxis to be the principal indication of third molar extraction, followed by orthodontic reasons. Regarding third molars with associated clinical symptoms or signs, infectious disease (including pericoronitis) was the pathology most often observed by the oral surgeon, followed by caries. This order of frequency was inverted in the case of third molars referred for extraction by the PCD. A vertical position predominated among the third molars with associated pathology.
Abstract:
The phonebook is one of the most heavily used features of a mobile phone. It must therefore be usable as quickly as possible in all situations, which requires efficient data structures and sorting algorithms in the phonebook server. In Nokia mobile phones, the phonebook server uses sorted arrays as its search structure. The goal of this work was to make the sorting of the phonebook server's search arrays as fast as possible. Several sorting algorithms were compared and their execution times analyzed in different situations. Insertion sort was found to be the fastest algorithm for sorting nearly ordered arrays, while the analysis showed quicksort to be the fastest for randomly ordered arrays. A quicksort-insertion sort hybrid was found to be the best algorithm for sorting the phonebook: with suitable parameterization it is fast for randomly ordered data, it exploits any pre-existing order in the data, and it does not significantly increase memory consumption. Thanks to the new algorithm, sorting of the search arrays is up to several tens of percent faster. A minimal sketch of such a hybrid is given below.
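The sketch assumes a cutoff of 16 elements below which insertion sort takes over; the thesis tunes the parameterization for the phonebook workload, so the value here is illustrative only.

```python
CUTOFF = 16  # assumed partition size below which insertion sort takes over

def insertion_sort(a, lo, hi):
    """Sort a[lo:hi+1] in place; fast when the range is nearly ordered."""
    for i in range(lo + 1, hi + 1):
        key = a[i]
        j = i - 1
        while j >= lo and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

def hybrid_sort(a, lo=0, hi=None):
    """Quicksort with an insertion-sort cutoff for small partitions."""
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        if hi - lo < CUTOFF:
            insertion_sort(a, lo, hi)
            return
        # Hoare partition around the middle element.
        pivot = a[(lo + hi) // 2]
        i, j = lo, hi
        while i <= j:
            while a[i] < pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i <= j:
                a[i], a[j] = a[j], a[i]
                i += 1
                j -= 1
        # Recurse on the smaller side, loop on the larger (bounds stack depth).
        if j - lo < hi - i:
            hybrid_sort(a, lo, j)
            lo = i
        else:
            hybrid_sort(a, i, hi)
            hi = j

data = [5, 3, 8, 1, 9, 2, 7]
hybrid_sort(data)
print(data)  # [1, 2, 3, 5, 7, 8, 9]
```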
Abstract:
Background: To evaluate the safety of immediate sequential bilateral cataract extraction (ISBCE) with respect to indications, visual outcomes, complications, benefits and disadvantages. Methods: This is a retrospective review of all ISBCEs performed at Kantonsspital Winterthur, Switzerland, between April 2000 and September 2013. The case notes of 500 eyes of 250 patients were reviewed. Of these 500 eyes, 472 (94.4%) had a straightforward phacoemulsification with posterior chamber intraocular lens implantation; 21 (4.2%) had a planned extracapsular cataract extraction; 4 (0.8%) had an intracapsular cataract extraction and 3 (0.6%) had a combined phacoemulsification with trabeculectomy. Results: Over 66% of eyes achieved improved visual acuity (at least 3 Snellen lines) following ISBCE. Median preoperative best corrected visual acuity (BCVA) was 0.5 LogMAR, with an interquartile range (IQR) of [0.4, 1] LogMAR. At the one-week control, the median BCVA was 0.3 LogMAR, IQR [0.1, 0.5] LogMAR. At one month, the median BCVA was 0.15 LogMAR, IQR [0.05, 0.3] (p < 0.01). No sight-threatening intraoperative or postoperative complications were observed. Conclusions: ISBCE is an effective and safe option with a high degree of patient satisfaction. The relative benefits of ISBCE should be balanced against the theoretically increased risks.
Abstract:
In the forensic investigation of firearm-related cases, determining the residual amount of volatile compounds remaining inside a cartridge can be useful in estimating the time since its discharge. Published approaches are based on following the decrease of selected target compounds as a function of time using solid phase micro-extraction (SPME). Naphthalene, as well as an unidentified decomposition product of nitrocellulose (referred to as "TEA2"), are usually employed for this purpose. However, their reliability can be questioned given their high volatility and the low reproducibility of their extracted quantities. In order to identify alternatives and thereby develop improved dating methods, an extensive study of the composition and variability of volatile residues in nine different types of cartridges was carried out. Analysis was performed using headspace sorptive extraction (HSSE), a more exhaustive technique than SPME. A total of 166 compounds were identified (several of them for the first time), and the final compositional characteristics of each residue were found to depend strongly on its source. The variability of individual identified compounds within and between different types of cartridge, as well as their evolution over time, was also studied. Many explosion products containing up to four aromatic rings were found to be present in high proportions among residues. Twenty-seven of them (excluding naphthalene) also presented detectable decreases during the first 24 h. They could therefore be used as complementary target analytes in future dating methods.
Abstract:
Functional genomic analyses require intact RNA; however, Passiflora edulis leaves are rich in secondary metabolites that interfere with RNA extraction, primarily by promoting oxidative processes and by precipitating with nucleic acids. This study aimed to analyse three RNA extraction methods, Concert™ Plant RNA Reagent (Invitrogen, Carlsbad, CA, USA), TRIzol® Reagent (Invitrogen) and TRIzol® Reagent (Invitrogen)/ice, all commercial products specifically designed to extract RNA, and to determine which method is the most effective for extracting RNA from the leaves of passion fruit plants. In contrast to the RNA extracted using the other two methods, the RNA extracted using TRIzol® Reagent (Invitrogen) did not have acceptable A260/A280 and A260/A230 ratios and did not have ideal concentrations. Agarose gel electrophoresis showed a strong DNA band for all of the Concert™ method extractions but not for the TRIzol® and TRIzol®/ice methods. The TRIzol® method resulted in smears during electrophoresis. Due to its low levels of DNA contamination, ideal A260/A280 and A260/A230 ratios and superior sample integrity, RNA from the TRIzol®/ice method was used for reverse transcription-polymerase chain reaction (RT-PCR), and the resulting amplicons were highly similar. We conclude that TRIzol®/ice is the preferred method of RNA extraction for P. edulis leaves.
Abstract:
Diagnosis of community-acquired legionella pneumonia (CALP) is currently performed by means of laboratory techniques which may delay diagnosis by several hours. To determine whether artificial neural networks (ANN) can categorize CALP and non-legionella community-acquired pneumonia (NLCAP) and serve as a standard for use by clinicians, we prospectively studied 203 patients with community-acquired pneumonia (CAP) diagnosed by laboratory tests. Twenty-one clinical and analytical variables were recorded to train a neural net with two classes (CALP or NLCAP). In this paper we deal with the problems of diagnosis, feature selection, and ranking of the features as a function of their classification importance, and with the design of a classifier under the criterion of maximizing the area under the ROC (receiver operating characteristic) curve, which gives a good trade-off between true positives and false negatives. In order to guarantee the validity of the statistics, the train-validation-test databases were rotated using the jackknife technique, and a multistart procedure was used to make the system insensitive to local maxima.
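As an illustration of the ROC-area criterion, the sketch below computes the area under the ROC curve as the Mann-Whitney statistic, i.e. the probability that a randomly chosen positive case scores above a randomly chosen negative one; the scores are invented, not outputs of the paper's neural network.

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve as the Mann-Whitney statistic:
    P(positive score > negative score), counting ties as one half."""
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return wins / (pos.size * neg.size)

# Invented classifier outputs for CALP (positive) and NLCAP (negative) cases.
calp_scores = [0.9, 0.8, 0.75, 0.6]
nlcap_scores = [0.7, 0.4, 0.3, 0.2, 0.1]
print(f"AUC = {roc_auc(calp_scores, nlcap_scores):.3f}")  # 0.950 here
```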
Abstract:
BACKGROUND: HIV surveillance requires monitoring of new HIV diagnoses and differentiation of incident and older infections. In 2008, Switzerland implemented a system for monitoring incident HIV infections based on the results of a line immunoassay (Inno-Lia) mandatorily conducted for HIV confirmation and type differentiation (HIV-1, HIV-2) of all newly diagnosed patients. Based on this system, we assessed the proportion of incident HIV infections among newly diagnosed cases in Switzerland during 2008-2013. METHODS AND RESULTS: Inno-Lia antibody reaction patterns recorded in anonymous HIV notifications to the federal health authority were classified by 10 published algorithms into incident (up to 12 months) or older infections. Utilizing these data, annual incident infection estimates were obtained in two ways: (i) based on the diagnostic performance of the algorithms, utilizing the relationship 'incident = true incident + false incident'; (ii) based on the window periods of the algorithms, utilizing the relationship 'Prevalence = Incidence x Duration'. From 2008 to 2013, 3'851 HIV notifications were received. Adult HIV-1 infections amounted to 3'809 cases, and 3'636 of them (95.5%) contained Inno-Lia data. Incident infection totals calculated were similar for the performance- and window-based methods, amounting on average to 1'755 (95% confidence interval, 1588-1923) and 1'790 cases (95% CI, 1679-1900), respectively. More than half of these were among men who have sex with men. Both methods showed a continuous decline of annual incident infections over 2008-2013, totaling -59.5% and -50.2%, respectively. The decline of incident infections continued even in 2012, when a 15% increase in HIV notifications was observed. This increase was entirely due to older infections. Overall declines during 2008-2013 were of similar extent among the major transmission groups. CONCLUSIONS: Inno-Lia-based incident HIV-1 infection surveillance proved useful and reliable. It represents a free, additional public health benefit of the use of this relatively costly test for HIV confirmation and type differentiation.
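The two estimators can be sketched as follows; the counts, diagnostic performance and window period below are illustrative stand-ins, not the published Swiss figures or algorithm parameters.

```python
# Illustrative inputs only; the paper uses 10 published Inno-Lia algorithms
# and real Swiss notification counts.
n_total = 600                # newly diagnosed adult HIV-1 cases in a year
n_classified_incident = 300  # cases an algorithm classifies as incident

# (i) Performance-based, from 'incident = true incident + false incident':
# observed = sensitivity * I + (1 - specificity) * (n_total - I), solved for I.
sensitivity = 0.90   # assumed P(classified incident | infection <= 12 months)
specificity = 0.95   # assumed P(classified older | infection > 12 months)
false_rate = 1.0 - specificity
incident_perf = (n_classified_incident - false_rate * n_total) / (sensitivity - false_rate)

# (ii) Window-based, from 'Prevalence = Incidence x Duration':
# annual incidence = incident-classified count / window period in years.
window_years = 12.0 / 12.0  # assumed mean window period of 12 months
incident_window = n_classified_incident / window_years

print(f"performance-based estimate: {incident_perf:.0f}")
print(f"window-based estimate:      {incident_window:.0f}")
```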
Abstract:
BACKGROUND: Lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM: Our aim was to challenge the validity of these software algorithms. METHODS: We compared multiple breath washout (MBW) results of the current software algorithms (automatic modus) with refined algorithms (manual modus) in 17 asymptomatic infants with CF and 24 matched healthy term-born infants. The main difference between these two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS: In infants with CF, the refined manual modus revealed clearly elevated LCI above 9 in 8 out of 35 measurements (23%), all of which showed LCI values below 8.3 using the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values using both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant reason for falsely normal LCI values in infants with CF using the automatic modus was the premature recognition of the end of test during the washout. CONCLUSION: We recommend the use of the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes.
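To illustrate the class of error involved, the sketch below implements a consensus-style end-of-test rule (end-tidal tracer concentration below 1/40th of its starting value for several consecutive breaths) and shows how requiring only a single sub-threshold breath ends the washout prematurely; this is an assumed illustration, not the vendor's actual algorithm.

```python
def end_of_test_breath(end_tidal_conc, required_breaths=3):
    """Index of the breath completing the washout: end-tidal tracer
    concentration below 1/40th of its starting value for N consecutive
    breaths (consensus-style rule; not the vendor's algorithm)."""
    threshold = end_tidal_conc[0] / 40.0
    run = 0
    for i, conc in enumerate(end_tidal_conc):
        run = run + 1 if conc < threshold else 0
        if run == required_breaths:
            return i
    return None  # washout incomplete

# Hypothetical end-tidal concentrations with a transient dip below threshold.
conc = [4.0, 3.1, 2.0, 1.2, 0.09, 0.2, 0.15, 0.09, 0.08, 0.07]
print(end_of_test_breath(conc, required_breaths=1))  # 4: premature end of test
print(end_of_test_breath(conc, required_breaths=3))  # 9: full washout captured
```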
Abstract:
Objective: The goal of the present retrospective study is to describe the distribution of supernumerary teeth in a population of patients attended at the Public Clinic of the Department of Oral Surgery. Background: Supernumerary teeth and multiple hyperdontia are usually associated with different syndromes, such as Gardner syndrome, or with facial fissures; however, they can also appear in patients without any pathology. Their prevalence ranges from 0.5% to 3.8% in patients with permanent teeth and from 0.35% to 0.6% in patients with primary teeth. Patients and Methods: A total of 36,057 clinical histories of patients admitted to the clinic between September 1991 and March 2003 were reviewed. The following data were extracted: age, sex, number of extracted supernumerary teeth, localization, morphology and type of supernumerary teeth. On this basis, 102 patients were included in the present study. Results: Of the 147 supernumerary teeth identified in the oral cavities of patients, 145 were extracted. The most frequent supernumerary teeth identified were mesiodens (46.9%), followed by premolars (24.1%) and fourth molars or distal molars (18%). As for location, 74.5% of the supernumerary teeth were found in the upper maxilla and 46.9% were present in the palatine/lingual area. Heteromorphology was found in two thirds of the supernumerary teeth, with the conical shape being the most frequent. Finally, 29.7% of the supernumerary teeth were in occlusion with permanent teeth, and mesiodens were the predominant type showing this feature. Conclusions: Mesiodens very frequently cause retention of permanent incisors, which erupt spontaneously after the extraction of the supernumerary teeth if there is sufficient space in the dental arch and if they conserve their eruptive force. Generally, supernumerary premolars are eumorphic and are casually discovered during radiological examination when not producing any symptomatology.
Abstract:
In this study we used market settlement prices of European call options on stock index futures to extract the implied probability density function (PDF). The method used produces a PDF of the returns of an underlying asset at the expiration date from the implied volatility smile. With this method, the assumption of a lognormal distribution (Black-Scholes model) is tested. The market view of the asset price dynamics can then be used for various purposes (hedging, speculation). We used the so-called smoothing approach for implied PDF extraction presented by Shimko (1993). In our analysis we obtained implied volatility smiles from index futures markets (S&P 500 and DAX indices) and standardized them. The method introduced by Breeden and Litzenberger (1978) was then used for PDF extraction. The results show significant deviations from the assumption of lognormal returns for S&P 500 options, while DAX options mostly fit the lognormal distribution. A subjective view of the PDF that deviates from the market's can be used to form a strategy, as discussed in the last section.
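The Breeden and Litzenberger (1978) step can be illustrated with finite differences: the risk-neutral density is the discounted second derivative of the smoothed call price with respect to strike, f(K) = e^{rT} d2C/dK2. In the sketch below the smoothed call prices are synthetic (Black-Scholes with a flat volatility), standing in for the interpolated smile; they are not S&P 500 or DAX data.

```python
import numpy as np
from scipy.stats import norm

# Synthetic smoothed call prices: Black-Scholes with a flat volatility,
# standing in for the interpolated smile of the smoothing approach.
S0, r, T, vol = 100.0, 0.02, 0.5, 0.25
strikes = np.linspace(60.0, 140.0, 161)

def bs_call(K):
    d1 = (np.log(S0 / K) + (r + 0.5 * vol**2) * T) / (vol * np.sqrt(T))
    d2 = d1 - vol * np.sqrt(T)
    return S0 * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

calls = bs_call(strikes)

# Breeden-Litzenberger: f(K) = exp(rT) * d2C/dK2, by central differences.
dK = strikes[1] - strikes[0]
density = np.exp(r * T) * np.diff(calls, 2) / dK**2  # density at strikes[1:-1]

# The density should integrate to ~1 over a wide enough strike range.
print(f"integral over [{strikes[1]:.0f}, {strikes[-2]:.0f}]: {(density * dK).sum():.3f}")
```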