938 results for source analysis


Relevance:

30.00%

Publisher:

Abstract:

The mass media are assigned an important role in political campaigns on popular votes. This article asks how the press communicates political issues to citizens during referendum campaigns, and whether some minimal criteria for successful public deliberation are met. The press coverage of all 24 ballot votes on welfare state issues from 1995 to 2004 in Switzerland is examined, distinguishing seven criteria to judge how news coverage compares to idealized notions of the media's role in the democratic process: coverage intensity, time for public deliberation, balance in media coverage, source independence and inclusiveness, substantive coverage, and spatial homogeneity. The results of our quantitative analysis suggest that the press does fulfil these normative requirements to a reasonable extent and that fears about biased or deceitful media treatment of ballot issues are not well-founded. However, some potential for optimizing the coverage of referendum campaigns by the Swiss press does exist.

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to examine, using content and discourse analysis, how companies communicate customer references on their websites. The study focused on the themes and discourses of the companies' reference descriptions and on how the reference relationship is discursively constructed in them. Three Finnish ICT companies were selected for the study: Nokia, TietoEnator and F-Secure. The data consist of 140 reference descriptions collected from the companies' websites. The content analysis showed that the reference descriptions concentrate on describing individual product or project deliveries to reference customers in the light of those customer relationships. The discourse analysis identified three discourses: a discourse of benefits, a discourse of commitment, and a discourse of technological expertise. The discourses reveal the rhetorical means used in the reference descriptions and construct the reference relationship and the supplier's subject position from different perspectives. The main emphasis in the reference descriptions is on the benefits brought by the supplier's solution. The discourses portray the reference relationship as a beneficial and close customer relationship that offers access to external capabilities and technologies. Depending on the discourse, the supplier is presented as a provider of benefits, a reliable partner, or an experienced expert. The reference customer, by contrast, is presented from only one perspective: stereotypically, as an important and satisfied customer.

Relevance:

30.00%

Publisher:

Abstract:

A case of sustained combustion of a human body that occurred in 2006 in Geneva, Switzerland, is presented. The body of a man was discovered at home and found to have been almost completely incinerated between the knees and the mid-chest, with less damage to the head, arms, lower legs and feet. His dog was also found dead just behind the house door. The external source of ignition was most likely a cigarette or a cigar. The chair in which the man had been sitting was largely consumed while other objects in the room exhibited only a brown oily or greasy coating and were virtually undamaged. Toxicological analyses carried out on the blood from the lower legs showed low levels of desalkylflurazepam. Alcohol concentration was 1.10 per thousand, carboxyhaemoglobin levels were 12% and cyanide concentration was 0.05 mg/L. Toxicological analyses carried out on the dog's blood showed carboxyhaemoglobin levels at 65%.

Relevance:

30.00%

Publisher:

Abstract:

The diffusion of mobile telephony began in 1971 in Finland, when the first car phones, called ARP, were taken into use. The technology evolved from ARP to NMT and later to GSM, but its main application remained voice transfer. The birth of the Internet created an open public data network and easy access to other types of computer-based services over networks. Telephones had been used as modems, but the development of cellular technologies enabled automatic access from mobile phones to the Internet. Other wireless technologies, for instance wireless LANs, were also introduced. Telephony in fixed networks had developed from analog to digital, which allowed easy integration of fixed and mobile networks. This development opened up completely new functionality for computers and mobile phones, and it also initiated the merger of the information technology (IT) and telecommunication (TC) industries. Despite the new competitive opportunities this created for firms, applications based on the new functionality were rare. Furthermore, technology development combined with innovation can be disruptive to industries. This research focuses on the new technology's impact on competition in the ICT industry, approached by understanding the strategic needs and alternative futures of the industry's customers. The speed of change in the ICT industry is high, and therefore it was valuable to integrate the dynamic capability view of the firm into this research. Dynamic capabilities are an application of the resource-based view (RBV) of the firm, and, as stated in the literature, strategic positioning complements the RBV. This theoretical framework leads the research to focus on three areas: customer strategic innovation and business model development, external future analysis, and process development combining the two. The theoretical contribution of the research is the development of a methodology integrating the RBV, dynamic capabilities and strategic positioning. The research approach was constructive, owing to the actual managerial problems that initiated the study, and the requirement for iterative and innovative progress supported this choice. The study applies known product development methods, for instance an innovation process in the Group Decision Support Systems (GDSS) laboratory and Quality Function Deployment (QFD), and combines them with known strategy analysis tools such as industry analysis and the scenario method. As its main result, the thesis presents a strategic innovation process in which new business concepts are used to describe alternative resource configurations and scenarios are used to describe alternative competitive environments; this can be a new way for firms to achieve competitive advantage in high-velocity markets. In addition to the strategic innovation process, the study also produced approximately 250 new innovations for the participating firms, reduced technology uncertainty, supported strategic infrastructure decisions in the firms, and produced a knowledge bank containing data from 43 ICT and 19 paper industry firms collected between 1999 and 2004. The methods presented in this research are also applicable to other industries.

Relevance:

30.00%

Publisher:

Abstract:

Belt-drive systems have been, and still are, the most commonly used form of power transmission in various applications of different scale and use. The peculiar features of belt-drive dynamics include highly nonlinear deformation, large rigid body motion, dynamic contact through a dry friction interface between the belt and the pulleys with sticking and slipping zones, cyclic tension of the belt during operation, and creeping of the belt against the pulleys. The life of the belt-drive depends critically on these features, and therefore a model that can be used to study the correlations between initial values and the responses of the belt-drive is a valuable source of information for the belt-drive development process. Traditionally, finite element models of belt-drives consist of a large number of elements, which may lead to computational inefficiency. In this research, the beneficial features of the absolute nodal coordinate formulation are utilized in the modeling of belt-drives in order to fulfill the following requirements for successful and efficient analysis of belt-drive systems: exact modeling of the rigid body inertia during arbitrary rigid body motion, consideration of the effect of shear deformation, exact description of the highly nonlinear deformations, and a simple and realistic description of the contact. Distributed contact forces and high-order beam and plate elements based on the absolute nodal coordinate formulation are applied to the modeling of belt-drives in two- and three-dimensional cases. According to the numerical results, realistic behavior of belt-drives can be obtained with a significantly smaller number of elements and degrees of freedom than in previously published finite element models of belt-drives. The results of the examples demonstrate the functionality and suitability of the absolute nodal coordinate formulation for computationally efficient and realistic modeling of belt-drives. This study also introduces an approach to avoid the problems related to the use of the continuum mechanics approach in the definition of elastic forces in the absolute nodal coordinate formulation. This approach is applied to a new computationally efficient two-dimensional shear deformable beam element based on the absolute nodal coordinate formulation. The proposed beam element uses a linear displacement field neglecting higher-order terms and a reduced number of nodal coordinates, which leads to fewer degrees of freedom in a finite element.
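As a minimal sketch of the kind of linear displacement field mentioned above (written in standard ANCF notation and not necessarily identical to the element formulated in this work), a two-dimensional shear deformable element of length $l$ with nodes $A$ and $B$ can interpolate the global position of a beam point $(x, y)$ as

\[ \mathbf{r}(x,y) = \Big(1-\tfrac{x}{l}\Big)\big(\mathbf{r}_A + y\,\mathbf{r}_{A,y}\big) + \tfrac{x}{l}\big(\mathbf{r}_B + y\,\mathbf{r}_{B,y}\big), \]

where the vector of nodal coordinates $\mathbf{e} = [\mathbf{r}_A^{\mathsf{T}}\;\mathbf{r}_{A,y}^{\mathsf{T}}\;\mathbf{r}_B^{\mathsf{T}}\;\mathbf{r}_{B,y}^{\mathsf{T}}]^{\mathsf{T}}$ contains the nodal position vectors and the transverse slope vectors, giving eight degrees of freedom per element; shear deformation is admitted because the slope vectors $\mathbf{r}_{A,y}$ and $\mathbf{r}_{B,y}$ need not remain perpendicular to the beam axis.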

Relevance:

30.00%

Publisher:

Abstract:

Occupational exposure modeling is widely used in the context of the E.U. regulation on the registration, evaluation, authorization, and restriction of chemicals (REACH). First-tier tools, such as the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) targeted risk assessment (TRA) or Stoffenmanager, are used to screen a wide range of substances. Those of concern are investigated further using second-tier tools, e.g., the Advanced REACH Tool (ART). Local sensitivity analysis (SA) methods are used here to determine dominant factors for three models commonly used within the REACH framework: ECETOC TRA v3, Stoffenmanager 4.5, and ART 1.5. Based on the results of the SA, the robustness of the models is assessed. For ECETOC, the process category (PROC) is the most important factor; a failure to identify the correct PROC has severe consequences for the exposure estimate. Stoffenmanager is the most balanced model, and decision-making uncertainties in one modifying factor are less severe in Stoffenmanager. ART requires a careful evaluation of the decisions in the source compartment, since it constitutes ∼75% of the total exposure range, which corresponds to an exposure estimate spanning 20-22 orders of magnitude. Our results indicate that there is a trade-off between the accuracy and the precision of the models. Previous studies suggested that ART may lead to more accurate results in well-documented exposure situations. However, the choice of the adequate model should ultimately be determined by the quality of the available exposure data: if the practitioner is uncertain concerning two or more decisions in the entry parameters, Stoffenmanager may be more robust than ART.
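As a rough illustration of the kind of local, one-at-a-time sensitivity analysis used to rank model inputs (a hedged sketch for a generic multiplicative exposure model; the factor names, categories, and values are hypothetical and do not reproduce ECETOC TRA, Stoffenmanager, or ART), one can compare how much of the spread in the log-exposure estimate each input is responsible for:

```python
import math

# Hypothetical multiplicative exposure model: exposure = baseline * f1 * f2 * f3.
# Each factor takes one of a few multipliers, mimicking the categorical inputs of
# screening tools; names and values are illustrative only.
FACTORS = {
    "emission_potential": [0.3, 1.0, 10.0],
    "local_controls":     [0.1, 0.3, 1.0],
    "dilution":           [0.5, 1.0, 3.0],
}
BASELINE = 1.0  # mg/m^3, hypothetical reference exposure

def exposure(choices):
    """Exposure estimate for one combination of factor multipliers."""
    value = BASELINE
    for multiplier in choices.values():
        value *= multiplier
    return value

def one_at_a_time_ranges():
    """Log10 range of the estimate when one factor varies over its categories
    while the others stay at their middle category (local, one-at-a-time SA)."""
    middle = {name: levels[len(levels) // 2] for name, levels in FACTORS.items()}
    ranges = {}
    for name, levels in FACTORS.items():
        estimates = [exposure({**middle, name: level}) for level in levels]
        ranges[name] = math.log10(max(estimates)) - math.log10(min(estimates))
    return ranges

if __name__ == "__main__":
    spans = one_at_a_time_ranges()
    total = sum(spans.values())
    for name, span in sorted(spans.items(), key=lambda kv: -kv[1]):
        share = 100.0 * span / total
        print(f"{name}: {span:.2f} orders of magnitude ({share:.0f}% of summed range)")
```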

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Worldwide data for cancer survival are scarce. We aimed to initiate worldwide surveillance of cancer survival by central analysis of population-based registry data, as a metric of the effectiveness of health systems, and to inform global policy on cancer control. METHODS: Individual tumour records were submitted by 279 population-based cancer registries in 67 countries for 25·7 million adults (age 15-99 years) and 75 000 children (age 0-14 years) diagnosed with cancer during 1995-2009 and followed up to Dec 31, 2009, or later. We looked at cancers of the stomach, colon, rectum, liver, lung, breast (women), cervix, ovary, and prostate in adults, and adult and childhood leukaemia. Standardised quality control procedures were applied; errors were corrected by the registry concerned. We estimated 5-year net survival, adjusted for background mortality in every country or region by age (single year), sex, and calendar year, and by race or ethnic origin in some countries. Estimates were age-standardised with the International Cancer Survival Standard weights. FINDINGS: 5-year survival from colon, rectal, and breast cancers has increased steadily in most developed countries. For patients diagnosed during 2005-09, survival for colon and rectal cancer reached 60% or more in 22 countries around the world; for breast cancer, 5-year survival rose to 85% or higher in 17 countries worldwide. Liver and lung cancer remain lethal in all nations: for both cancers, 5-year survival is below 20% everywhere in Europe, in the range 15-19% in North America, and as low as 7-9% in Mongolia and Thailand. Striking rises in 5-year survival from prostate cancer have occurred in many countries: survival rose by 10-20% between 1995-99 and 2005-09 in 22 countries in South America, Asia, and Europe, but survival still varies widely around the world, from less than 60% in Bulgaria and Thailand to 95% or more in Brazil, Puerto Rico, and the USA. For cervical cancer, national estimates of 5-year survival range from less than 50% to more than 70%; regional variations are much wider, and improvements between 1995-99 and 2005-09 have generally been slight. For women diagnosed with ovarian cancer in 2005-09, 5-year survival was 40% or higher only in Ecuador, the USA, and 17 countries in Asia and Europe. 5-year survival for stomach cancer in 2005-09 was high (54-58%) in Japan and South Korea, compared with less than 40% in other countries. By contrast, 5-year survival from adult leukaemia in Japan and South Korea (18-23%) is lower than in most other countries. 5-year survival from childhood acute lymphoblastic leukaemia is less than 60% in several countries, but as high as 90% in Canada and four European countries, which suggests major deficiencies in the management of a largely curable disease. INTERPRETATION: International comparison of survival trends reveals very wide differences that are likely to be attributable to differences in access to early diagnosis and optimum treatment. Continuous worldwide surveillance of cancer survival should become an indispensable source of information for cancer patients and researchers and a stimulus for politicians to improve health policy and health-care systems. 
FUNDING: Canadian Partnership Against Cancer (Toronto, Canada), Cancer Focus Northern Ireland (Belfast, UK), Cancer Institute New South Wales (Sydney, Australia), Cancer Research UK (London, UK), Centers for Disease Control and Prevention (Atlanta, GA, USA), Swiss Re (London, UK), Swiss Cancer Research foundation (Bern, Switzerland), Swiss Cancer League (Bern, Switzerland), and University of Kentucky (Lexington, KY, USA).
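As a sketch of the age-standardisation step described in the methods (the general textbook form only; the underlying age-specific estimates in this study are net survival probabilities adjusted for background mortality from life tables), the age-standardised 5-year net survival is a weighted average of the age-specific estimates $S_a(5)$ using the International Cancer Survival Standard weights $w_a$:

\[ S_{\mathrm{std}}(5) = \sum_a w_a\, S_a(5), \qquad \sum_a w_a = 1 . \]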

Relevance:

30.00%

Publisher:

Abstract:

After the release of the gamma-ray source catalog produced by the Fermi satellite during its first two years of operation, a significant fraction of sources still remain unassociated at lower energies. In addition to well-known high-energy emitters (pulsars, blazars, supernova remnants, etc.), theoretical expectations predict new classes of gamma-ray sources. In particular, gamma-ray emission could be associated with some of the early phases of stellar evolution, but this interesting possibility is still poorly understood. Aims: The aim of this paper is to assess the possibility of the Fermi gamma-ray source 2FGL J0607.5-0618c being associated with the massive star forming region Monoceros R2. Methods: A multi-wavelength analysis of the Monoceros R2 region is carried out using archival data at radio, infrared, X-ray, and gamma-ray wavelengths. The resulting observational properties are used to estimate the physical parameters needed to test the different physical scenarios. Results: We confirm the 2FGL J0607.5-0618c detection with improved confidence over the Fermi two-year catalog. We find that a combined effect of the multiple young stellar objects in Monoceros R2 is a viable picture for the nature of the source.

Relevance:

30.00%

Publisher:

Abstract:

Coffee, a major dietary source of caffeine, is among the most widely consumed beverages in the world and has received considerable attention regarding health risks and benefits. We conducted a genome-wide (GW) meta-analysis of predominantly regular-type coffee consumption (cups per day) among up to 91 462 coffee consumers of European ancestry, with top single-nucleotide polymorphisms (SNPs) followed up in ~30 062 and 7964 coffee consumers of European and African-American ancestry, respectively. Studies from both stages were combined in a trans-ethnic meta-analysis. Confirmed loci were examined for putative functional and biological relevance. Eight loci, including six novel loci, met GW significance (log10 Bayes factor (BF) > 5.64) with per-allele effect sizes of 0.03-0.14 cups per day. Six are located in or near genes potentially involved in pharmacokinetics (ABCG2, AHR, POR and CYP1A2) and pharmacodynamics (BDNF and SLC6A4) of caffeine. Two map to GCKR and MLXIPL, genes related to metabolic traits but lacking known roles in coffee consumption. Enhancer and promoter histone marks populate the regions of many confirmed loci, and several potential regulatory SNPs are highly correlated with the lead SNP of each. SNP alleles near GCKR, MLXIPL, BDNF and CYP1A2 that were associated with higher coffee consumption have previously been associated with smoking initiation, higher adiposity and fasting insulin and glucose, but lower blood pressure and favorable lipid, inflammatory and liver enzyme profiles (P < 5 × 10⁻⁸). Our genetic findings among European and African-American adults reinforce the role of caffeine in mediating habitual coffee consumption and may point to molecular mechanisms underlying inter-individual variability in the pharmacological and health effects of coffee.
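For orientation, the first-stage GW meta-analysis of per-study SNP effects can be sketched with the textbook fixed-effect, inverse-variance-weighted combination shown below (a generic form for illustration only; the trans-ethnic stage of this study was summarised with Bayes factors rather than this statistic):

\[ \hat{\beta} = \frac{\sum_i w_i\, \hat{\beta}_i}{\sum_i w_i}, \qquad w_i = \frac{1}{\mathrm{SE}_i^{2}}, \qquad \mathrm{SE}(\hat{\beta}) = \frac{1}{\sqrt{\sum_i w_i}} . \]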

Relevance:

30.00%

Publisher:

Abstract:

Gene turnover rates and the evolution of gene family sizes are important aspects of genome evolution. Here, we use curated sequence data for the major chemosensory gene families of Drosophila (the gustatory receptor, odorant receptor, ionotropic receptor, and odorant-binding protein families) to conduct a comparative analysis among families, exploring different methods to estimate gene birth and death rates, including an ad hoc simulation study. Remarkably, we found that the state-of-the-art methods may produce very different rate estimates, which may lead to disparate conclusions regarding the evolution of chemosensory gene family sizes in Drosophila. Among biological factors, we found that a peculiarity of D. sechellia's gene turnover rates was a major source of bias in global estimates, whereas gene conversion had negligible effects for the families analyzed here. Turnover rates vary considerably among families, subfamilies, and ortholog groups, although all analyzed families were quite dynamic in terms of gene turnover. Computer simulations showed that the methods that use ortholog group information appear to be the most accurate for the Drosophila chemosensory families. Most importantly, these results reveal the potential of rate heterogeneity among lineages to severely bias some turnover rate estimation methods and the need to further evaluate the performance of these methods across a more diverse sample of gene families and phylogenetic contexts. Using branch-specific codon substitution models, we find further evidence of positive selection in recently duplicated genes, which attests to a nonneutral aspect of the gene birth-and-death process.
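As a toy illustration of the kind of simulation that can be used to study gene birth-and-death dynamics (a hedged sketch with arbitrary rates, branch length, and family size; it does not reproduce the ad hoc simulation design of this study), family size along one lineage can be modelled as a per-gene duplication and loss process:

```python
import random

def simulate_family_size(n0, birth_rate, death_rate, time, dt=0.01, rng=random):
    """Evolve the size of one gene family along a branch.

    n0          -- initial number of genes
    birth_rate  -- duplications per gene per unit time
    death_rate  -- losses per gene per unit time
    time        -- branch length (same time units as the rates)
    """
    n, t = n0, 0.0
    while t < time and n > 0:
        # With a small step dt, each gene independently duplicates or is lost.
        births = sum(rng.random() < birth_rate * dt for _ in range(n))
        deaths = sum(rng.random() < death_rate * dt for _ in range(n))
        n = max(n + births - deaths, 0)
        t += dt
    return n

if __name__ == "__main__":
    random.seed(1)
    # Arbitrary illustrative rates (per gene per million years) and a 10 My branch.
    final_sizes = [simulate_family_size(60, 0.005, 0.005, 10.0) for _ in range(500)]
    print("mean final family size:", sum(final_sizes) / len(final_sizes))
```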

Relevance:

30.00%

Publisher:

Abstract:

Several methods and approaches for measuring parameters to determine fecal sources of pollution in water have been developed in recent years. No single microbial or chemical parameter has proved sufficient to determine the source of fecal pollution. Combinations of parameters involving at least one discriminating indicator and one universal fecal indicator offer the most promising solutions for qualitative and quantitative analyses. The universal (nondiscriminating) fecal indicator provides quantitative information regarding the fecal load. The discriminating indicator contributes to the identification of a specific source. The relative values of the parameters derived from both kinds of indicators could provide information regarding the contribution to the total fecal load from each origin. It is also essential that both parameters characteristically persist in the environment for similar periods. Numerical analysis, such as inductive learning methods, could be used to select the most suitable and the lowest number of parameters to develop predictive models. These combinations of parameters provide information on factors affecting the models, such as dilution, specific types of animal source, persistence of microbial tracers, and complex mixtures from different sources. The combined use of the enumeration of somatic coliphages and the enumeration of Bacteroides-phages using different host specific strains (one from humans and another from pigs), both selected using the suggested approach, provides a feasible model for quantitative and qualitative analyses of fecal source identification.
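A minimal sketch of how a universal and a discriminating indicator might be combined in practice (the counts and the helper function are hypothetical illustrations; the approach described above builds quantitative predictive models, e.g., by inductive learning, rather than this simple ratio rule):

```python
import math

def source_signal(universal_count, discriminating_count):
    """Log10 ratio of a host-associated (discriminating) indicator to a universal
    fecal indicator; higher values point towards the corresponding host source."""
    if universal_count <= 0 or discriminating_count <= 0:
        return float("-inf")
    return math.log10(discriminating_count / universal_count)

# Hypothetical plaque counts per 100 mL for one water sample.
somatic_coliphages = 2.0e4        # universal (nondiscriminating) fecal indicator
human_bacteroides_phages = 1.5e3  # discriminating indicator, human-associated host strain
pig_bacteroides_phages = 3.0e1    # discriminating indicator, pig-associated host strain

human_signal = source_signal(somatic_coliphages, human_bacteroides_phages)
pig_signal = source_signal(somatic_coliphages, pig_bacteroides_phages)
likely = "human" if human_signal > pig_signal else "pig"
print(f"human ratio: {human_signal:.2f} log10, pig ratio: {pig_signal:.2f} log10 -> likely {likely} source")
```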

Relevance:

30.00%

Publisher:

Abstract:

This Master's thesis examines new methods for independent component analysis (ICA). The methods are based on colligation and on cross-moments. The colligation method is based on the colligation of weights; it uses two types of probability distributions instead of one and rests on a general independence criterion. The colligation approach is applied with two asymptotic expansions: Gram-Charlier and Edgeworth expansions are used to approximate the probability densities in these methods. The thesis also uses a cross-moment method based on fourth-order cross-moments, which is very similar to the FastICA algorithm. Both methods are examined on a linear mixture of two independent sources, where the source signals and the mixing matrix are unknown except for the number of signal sources. The colligation method and its modifications are compared with FastICA and JADE. A comparative analysis of performance and CPU time is also carried out for the cross-moment-based methods, FastICA, and JADE on several pairs of mixed signals.
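As a point of reference for the comparison described above, a minimal FastICA run on a synthetic two-source linear mixture is sketched below (using scikit-learn's FastICA; the colligation and cross-moment methods of the thesis are not implemented here, and the signals and mixing matrix are invented for illustration):

```python
import numpy as np
from sklearn.decomposition import FastICA

# Two independent sources: a sine wave and a square wave, plus a little noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t))]
sources += 0.02 * rng.standard_normal(sources.shape)

# Unknown 2x2 mixing matrix; only the number of sources is assumed known.
mixing = np.array([[1.0, 0.6],
                   [0.4, 1.0]])
mixed = sources @ mixing.T

# Recover the sources with FastICA.
ica = FastICA(n_components=2, random_state=0)
estimated = ica.fit_transform(mixed)

# Correlation between estimated and true sources (up to permutation, sign and scale).
corr = np.corrcoef(estimated.T, sources.T)[:2, 2:]
print(np.round(np.abs(corr), 2))
```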

Relevance:

30.00%

Publisher:

Abstract:

Dual-trap optical tweezers are often used in high-resolution measurements in single-molecule biophysics. Such measurements can be hindered by the presence of extraneous noise sources, the most prominent of which is the coupling of fluctuations along different spatial directions, which may affect any optical tweezers setup. In this article, we analyze, from both the theoretical and the experimental points of view, the most common source of these couplings in dual-trap optical-tweezers setups: the misalignment of traps and tether. We give criteria to distinguish different kinds of misalignment, to estimate their quantitative relevance, and to include them in the data analysis. The experimental data are obtained in a, to our knowledge, novel dual-trap optical-tweezers setup that directly measures forces. In the case in which misalignment is negligible, we provide a method to measure the stiffness of traps and tether based on variance analysis; this method can be seen as a calibration technique valid beyond the linear trap region. Our analysis is then employed to measure the persistence length of dsDNA tethers of three different lengths spanning two orders of magnitude. The effective persistence length of such tethers is shown to decrease with the contour length, in accordance with previous studies.
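For orientation, the simplest variance-based stiffness estimate is the textbook equipartition relation for a single trap in its linear region (the variance analysis in this article generalizes the idea to the coupled trap-tether-trap system and to directly measured forces):

\[ k_{\mathrm{trap}} = \frac{k_B T}{\big\langle (x - \langle x \rangle)^2 \big\rangle}, \]

so the trap stiffness follows from the thermal fluctuations of the bead position $x$ at absolute temperature $T$, with $k_B$ the Boltzmann constant.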

Relevance:

30.00%

Publisher:

Abstract:

This Master's thesis concerns the study of spectral images from the viewpoint of a statistical image model. The first part of the thesis examines how the distributions of the statistical parameters affect colours and highlights under different illumination conditions. It was found that the relationships between the statistical parameters do not depend on the illumination conditions but do depend on how noise-free the image is. It also turned out that high kurtosis may be caused by colour saturation. In addition, a texture-combination algorithm based on the statistical spectral model was developed; it achieved good results when the dependencies between the statistical parameters held. In the second part of the thesis, various spectral images were studied using independent component analysis (ICA). The following ICA algorithms were examined, with the emphasis on the quality of the separation: JADE, fixed-point ICA, and moment-based ICA. The best separation was achieved with the JADE algorithm, although the differences between the algorithms were not significant. The algorithm divided an image into two independent components, either a highlighted and a non-highlighted component or a chromatic and an achromatic component. Finally, the relationship of kurtosis to image properties such as highlights and colour saturation is discussed, and the last part of the thesis suggests possible topics for further research.

Relevance:

30.00%

Publisher:

Abstract:

To develop systems for detecting Alzheimer's disease, we use EEG signals. The available database is raw, so the first step is to clean the signals properly. We propose a new way of ICA cleaning on a database recorded from patients with Alzheimer's disease (mild AD, early stage). Two researchers visually inspected all the signals (EEG channels), and the least corrupted (artefact-clean) continuous 20 s interval of each recording was chosen for the analysis. Each trial was then decomposed using ICA. Sources were ordered using a kurtosis measure, and the researchers cleared up to seven sources per trial corresponding to artefacts (eye movements, EMG corruption, EKG, etc.), using three criteria: (i) isolated source on the scalp (only a few electrodes contribute to the source), (ii) abnormal wave shape (drifts, eye blinks, sharp waves, etc.), (iii) source of abnormally high amplitude (>100 μV). We then evaluated the outcome of this cleaning by means of the classification of patients using multilayer perceptron neural networks. The results are very satisfactory: performance increases from 50.9% to 73.1% correctly classified data when the ICA cleaning procedure is used.
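A minimal sketch of this kind of kurtosis-ordered ICA cleaning (using scikit-learn's FastICA on a generic multichannel array; the channel count, sampling rate, and automatic "drop the most kurtotic sources" rule are illustrative assumptions, whereas in the study the artefact sources were cleared by visual inspection against the three criteria above):

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

def ica_clean(eeg, n_drop=7, random_state=0):
    """Remove the n_drop most kurtotic ICA sources from an EEG segment.

    eeg -- array of shape (n_samples, n_channels), e.g. a 20 s artefact-inspected interval
    """
    ica = FastICA(n_components=eeg.shape[1], random_state=random_state)
    sources = ica.fit_transform(eeg)              # (n_samples, n_components)

    # Order sources by kurtosis; very peaked sources often correspond to artefacts
    # such as eye blinks or EKG (in the study the final decision was made visually).
    order = np.argsort(kurtosis(sources, axis=0))[::-1]
    sources[:, order[:n_drop]] = 0.0              # clear the suspected artefact sources

    return ica.inverse_transform(sources)         # back to channel space

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_eeg = rng.standard_normal((20 * 128, 21))  # 20 s at 128 Hz, 21 channels (synthetic)
    cleaned = ica_clean(fake_eeg)
    print(cleaned.shape)
```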