992 results for Mapping Problem
Abstract:
Dysmenorrhea is common in the adolescent years, especially after the onset of ovulatory cycles, usually 2 to 3 years after menarche. Pain and associated symptoms are responsible for school absenteeism and interruption of sports and social activities. OBJECTIVES: This study aims to measure the prevalence of severe dysmenorrhea and its consequences among adolescent girls in Switzerland. Treatment of dysmenorrhea is discussed and recommendations for clinical practice are given. STUDY DESIGN: Cross-sectional survey (SMASH 02) of a nationally representative sample of adolescents (n=7548; 3340 females), aged 16 to 20 years, who attended post-mandatory education. A self-administered questionnaire was used to assess the severity of dysmenorrhea and its consequences on daily life, pursuit of medical help, and medications used. RESULTS: Among the 3340 girls, 86.6% suffered from dysmenorrhea-related symptoms: 12.4% described severe dysmenorrhea and 74.2% moderate dysmenorrhea. Girls with severe dysmenorrhea reported greater consequences on daily activities than girls without dysmenorrhea: 47.8% reported staying at home and 66.5% reported reducing their sports activities. Yet fewer than half had consulted a physician for this complaint, and even fewer were treated properly. RECOMMENDATION: The pediatrician has a pivotal role in screening young patients for dysmenorrhea, as well as in educating and effectively treating adolescent girls with menstruation-associated symptoms. Non-steroidal anti-inflammatory drugs are considered the first line of treatment for dysmenorrhea; adolescents whose symptoms do not respond to this treatment over 3 menstrual periods should be offered combined oestroprogestative contraception and must be followed up, as non-responders may have an underlying organic pathology.
CONCLUSION: Dysmenorrhea is a frequent health problem in adolescent years and adolescent care providers should be able to care for these patients in an efficient way.
Abstract:
A number of experimental methods have been reported for estimating the number of genes in a genome, or the closely related coding density of a genome, defined as the fraction of base pairs in codons. Recently, DNA sequence data representative of the genome as a whole have become available for several organisms, making the problem of estimating coding density amenable to sequence-analytic methods. Estimates of coding density for a single genome vary widely, so that methods with characterized error bounds have become increasingly desirable. We present a method to estimate the protein coding density in a corpus of DNA sequence data, in which a ‘coding statistic’ is calculated for a large number of windows of the sequence under study, and the distribution of the statistic is decomposed into two normal distributions, assumed to be the distributions of the coding statistic in the coding and noncoding fractions of the sequence windows. The accuracy of the method is evaluated using known data, and application is made to the yeast chromosome III sequence and to C. elegans cosmid sequences. It can also be applied to fragmentary data, for example a collection of short sequences determined in the course of STS mapping.
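The decomposition described above can be sketched with a small expectation-maximization fit of a two-component Gaussian mixture: the coding density then falls out as the mixing weight of the higher-scoring component. This is a minimal stdlib-only illustration on simulated window statistics, not the authors' exact procedure; the means, standard deviations and window counts below are invented for the example.

```python
import math
import random

def em_two_gaussians(xs, iters=200):
    """Fit a two-component 1-D Gaussian mixture by EM.
    Returns (weight, mu1, sd1, mu2, sd2), where `weight` is the
    mixing proportion of the first component."""
    xs = list(xs)
    mu1, mu2 = min(xs), max(xs)              # crude initialization
    sd1 = sd2 = (max(xs) - min(xs)) / 4 or 1.0
    w = 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each window statistic
        resp = []
        for x in xs:
            p1 = w * math.exp(-0.5 * ((x - mu1) / sd1) ** 2) / sd1
            p2 = (1 - w) * math.exp(-0.5 * ((x - mu2) / sd2) ** 2) / sd2
            resp.append(p1 / (p1 + p2))
        # M-step: re-estimate weight, means and standard deviations
        n1 = sum(resp)
        w = n1 / len(xs)
        mu1 = sum(r * x for r, x in zip(resp, xs)) / n1
        mu2 = sum((1 - r) * x for r, x in zip(resp, xs)) / (len(xs) - n1)
        sd1 = math.sqrt(sum(r * (x - mu1) ** 2
                            for r, x in zip(resp, xs)) / n1) or 1e-6
        sd2 = math.sqrt(sum((1 - r) * (x - mu2) ** 2
                            for r, x in zip(resp, xs)) / (len(xs) - n1)) or 1e-6
    return w, mu1, sd1, mu2, sd2

# Simulated corpus: 30% of windows "coding" (statistic ~ N(2, 0.5)),
# 70% "noncoding" (statistic ~ N(0, 0.5)).
random.seed(1)
stats = [random.gauss(2.0, 0.5) for _ in range(300)] + \
        [random.gauss(0.0, 0.5) for _ in range(700)]
w, mu1, sd1, mu2, sd2 = em_two_gaussians(stats)
# Coding density = weight of the component with the higher mean statistic
coding_density = w if mu1 > mu2 else 1 - w
```

On well-separated distributions, as here, the recovered mixing weight closely matches the simulated coding fraction of 0.3.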
Abstract:
We address the problem of comparing and characterizing the promoter regions of genes with similar expression patterns. This remains a challenging problem in sequence analysis, because the promoter regions of co-expressed genes often do not show discernible sequence conservation. Thus, in our approach we have not directly compared the nucleotide sequence of promoters. Instead, we have obtained predictions of transcription factor binding sites, annotated the predicted sites with the labels of the corresponding binding factors, and aligned the resulting sequences of labels, to which we refer here as transcription factor maps (TF-maps). To obtain the global pairwise alignment of two TF-maps, we have adapted an algorithm initially developed to align restriction enzyme maps. We have optimized the parameters of the algorithm in a small, but well-curated, collection of human–mouse orthologous gene pairs. Results in this dataset, as well as in an independent, much larger dataset from the CISRED database, indicate that TF-map alignments are able to uncover conserved regulatory elements that cannot be detected by typical sequence alignments.
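The global pairwise alignment of two label sequences can be illustrated with a standard Needleman–Wunsch dynamic program. The authors adapted a restriction-enzyme-map alignment algorithm instead, so treat this as a generic sketch; the factor labels and scoring parameters are invented for the example.

```python
def align_tf_maps(a, b, match=2, mismatch=-1, gap=-1):
    """Global (Needleman-Wunsch) alignment of two sequences of
    transcription-factor labels; returns (score, aligned_a, aligned_b)."""
    n, m = len(a), len(b)
    # Dynamic-programming score matrix
    S = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        S[i][0] = S[i - 1][0] + gap
    for j in range(1, m + 1):
        S[0][j] = S[0][j - 1] + gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            S[i][j] = max(S[i - 1][j - 1] + sub,
                          S[i - 1][j] + gap,
                          S[i][j - 1] + gap)
    # Traceback to recover the aligned label sequences
    out_a, out_b = [], []
    i, j = n, m
    while i > 0 or j > 0:
        sub = match if i > 0 and j > 0 and a[i - 1] == b[j - 1] else mismatch
        if i > 0 and j > 0 and S[i][j] == S[i - 1][j - 1] + sub:
            out_a.append(a[i - 1]); out_b.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and S[i][j] == S[i - 1][j] + gap:
            out_a.append(a[i - 1]); out_b.append('-'); i -= 1
        else:
            out_a.append('-'); out_b.append(b[j - 1]); j -= 1
    return S[n][m], out_a[::-1], out_b[::-1]

# Two promoters annotated with predicted binding-factor labels
human = ["SP1", "NFKB", "TATA", "AP1"]
mouse = ["SP1", "TATA", "AP1"]
score, ha, ma = align_tf_maps(human, mouse)
# The shared SP1-TATA-AP1 arrangement aligns; the extra NFKB site is gapped.
```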
Abstract:
The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must therefore rely on a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R, for Flow path assessment of gravitational hazards at a Regional scale (available free of charge at www.flow-r.org), and has been successfully applied to case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM and that avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of dataset availability. Among the possible datasets, the DEM is the only one that is strictly needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained from lower-quality DEMs with 25 m resolution.
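A Holmgren-style multiple-flow-direction spreading step can be sketched as follows. This is a simplified illustration of the general idea (susceptibility distributed to lower neighbours in proportion to tan(slope)**x, where larger x gives more channelized flow), not the improved Flow-R algorithm itself; the toy DEM and exponent are invented.

```python
def spread_step(dem, susceptibility, x=4.0, cellsize=10.0):
    """One spreading step of a Holmgren-style multiple-flow-direction
    algorithm on a gridded DEM: each source cell passes its susceptibility
    to its lower neighbours in proportion to tan(slope) ** x."""
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if susceptibility[r][c] == 0:
                continue
            # Weight every downslope neighbour by its slope gradient
            weights = {}
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    nr, nc = r + dr, c + dc
                    if (dr, dc) == (0, 0) or not (0 <= nr < rows and 0 <= nc < cols):
                        continue
                    drop = dem[r][c] - dem[nr][nc]
                    if drop > 0:
                        dist = cellsize * (2 ** 0.5 if dr and dc else 1.0)
                        weights[(nr, nc)] = (drop / dist) ** x
            total = sum(weights.values())
            if total == 0:       # pit or flat: nothing propagates
                continue
            for (nr, nc), w in weights.items():
                out[nr][nc] += susceptibility[r][c] * w / total
    return out

# Tilted 3x3 DEM with a slight central channel; source in the top row
dem = [[30, 28, 30],
       [20, 18, 20],
       [10,  8, 10]]
src = [[0, 1, 0], [0, 0, 0], [0, 0, 0]]
step1 = spread_step(dem, src)
# The steepest (central) path receives most of the propagated mass.
```

Iterating this step cell by cell downslope yields the propagation extent; mass is conserved at each step because the neighbour weights are normalized.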
Abstract:
Lexical resources are a critical component of Natural Language Processing applications. However, the high cost of comparing and merging different resources has been a bottleneck to obtaining richer resources, with a broad range of potential uses, for a significant number of languages. With the objective of reducing cost by eliminating human intervention, we present a new method for automating the merging of resources, with special emphasis on what we call the mapping step. This mapping step, which converts the resources into a common format that later allows the merging, is usually performed with huge manual effort and thus makes the whole process very costly. We therefore propose a method to perform this mapping fully automatically. To test our method, we have addressed the merging of two verb subcategorization frame lexica for Spanish. The results achieved, which almost replicate human work, demonstrate the feasibility of the approach.
Abstract:
Iowa Vocational Rehabilitation Services, a Division of the State of Iowa Department of Education, in partnership with seven other state agencies, applied for and was awarded funding for “Improving Transition Outcomes for Youth with Disabilities Through the Use of Intermediaries.” This Innovative State Alignment Grant is funded by the Department of Labor, Office of Disability Employment Policy. For clarity and brevity, the Iowa team chose to use “Improving Transition Outcomes” as the project name, thus providing the acronym ITO. Grant funding began on October 1, 2003, with the possibility of renewal for five years.
Abstract:
This paper deals with the problem of spatial data mapping. A new method based on wavelet interpolation and geostatistical prediction (kriging) is proposed. The method, wavelet analysis residual kriging (WARK), is developed to address the problems arising for highly variable data in the presence of spatial trends, cases in which stationary prediction models have very limited application. Wavelet analysis is used to model large-scale structures, and kriging of the remaining residuals focuses on small-scale peculiarities. WARK is able to model spatial patterns that feature multiscale structure. In the present work WARK is applied to rainfall data, and the validation results are compared with those obtained from neural network residual kriging (NNRK). NNRK is also a residual-based method, which uses an artificial neural network to model large-scale non-linear trends. The comparison of the results demonstrates the high-quality performance of WARK in predicting hot spots and in reproducing the global statistical characteristics of the distribution and the spatial correlation structure.
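The two-stage logic of residual kriging can be sketched in one dimension as follows. For brevity, the large-scale trend is estimated here with a Gaussian-kernel smoother standing in for the wavelet approximation, and the residuals are interpolated by simple kriging with an exponential covariance model; the data, bandwidth, sill and range values are invented for the example.

```python
import math

def trend_at(x0, xs, ys, bandwidth=2.0):
    """Large-scale trend by Gaussian-kernel smoothing -- a stand-in
    here for the wavelet approximation used in WARK."""
    w = [math.exp(-0.5 * ((x - x0) / bandwidth) ** 2) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

def simple_krige(xs, res, x0, sill=1.0, rng=3.0):
    """Simple (zero-mean) kriging of residuals with an exponential
    covariance model C(h) = sill * exp(-|h| / rng)."""
    n = len(xs)
    cov = lambda h: sill * math.exp(-abs(h) / rng)
    # Augmented system [K | k], solved by Gaussian elimination
    A = [[cov(xs[i] - xs[j]) for j in range(n)] + [cov(xs[i] - x0)]
         for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    lam = [0.0] * n
    for r in range(n - 1, -1, -1):
        lam[r] = (A[r][n] - sum(A[r][c] * lam[c]
                                for c in range(r + 1, n))) / A[r][r]
    return sum(l * e for l, e in zip(lam, res))

# Observations: roughly linear large-scale trend plus small-scale noise
xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [1.0, 2.1, 3.9, 6.2, 7.8, 10.1]
res = [y - trend_at(x, xs, ys) for x, y in zip(xs, ys)]
# Residual-kriging prediction at an unsampled location: trend + kriged residual
x0 = 2.5
pred = trend_at(x0, xs, ys) + simple_krige(xs, res, x0)
```

Because kriging is an exact interpolator, the combined prediction reproduces the observations at the sample points; between them it blends the smoothed trend with the spatially correlated residual field.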
Abstract:
In recent decades, globalized competition among cities and regions has driven them to develop new strategies for branding and promoting their territory to attract tourists, investors, companies and residents. Major sports events - such as the Olympic Games, the FIFA World Cup or World and Continental Championships - have played an integral part in these strategies. Believing, with or without evidence, in the capacity of those events to improve the visibility and the economy of the host destination, many cities, regions and even countries have engaged in establishing sports events hosting strategies. The problem of globalized competition in the sports events "market" is that many cities and regions do not have the resources - whether financial, human or infrastructural - to compete in hosting major sports events. Consequently, many cities or regions have to turn to second-tier sports events. Organising those smaller events means less media coverage and more difficulty in finding sponsors, while the costs - both financial and in terms of services - remain high for the community. This paper analyses how Heritage Sporting Events (HSEs) might be an opportunity for cities and regions engaged in sports events hosting strategies. The HSE is an emerging concept that to date has been under-researched in the academic literature. Therefore, this paper aims to define the concept through an exploratory research study. A multidisciplinary literature review reveals two major characteristics of HSEs: sustainability in the territory and the authenticity of the event, constructed through a differentiation process. These characteristics, defined through multiple variables, give us the opportunity to observe the process by which a sports event is constructed into a heritage object. This paper argues that HSEs can be seen as territorial resources that can represent a competitive advantage for host destinations.
In conclusion, academics are invited to further research HSEs to better understand their construction process and their impacts on the territory, while local authorities are invited to consider HSEs for the branding and the promotion of their territory.
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it is to be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a process to randomly generate alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. The experiments also show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic simply by incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
Abstract:
Structurally segregated and functionally specialized regions of the human cerebral cortex are interconnected by a dense network of cortico-cortical axonal pathways. By using diffusion spectrum imaging, we noninvasively mapped these pathways within and across cortical hemispheres in individual human participants. An analysis of the resulting large-scale structural brain networks reveals a structural core within posterior medial and parietal cerebral cortex, as well as several distinct temporal and frontal modules. Brain regions within the structural core share high degree, strength, and betweenness centrality, and they constitute connector hubs that link all major structural modules. The structural core contains brain regions that form the posterior components of the human default network. Looking both within and outside of core regions, we observed a substantial correspondence between structural connectivity and resting-state functional connectivity measured in the same participants. The spatial and topological centrality of the core within cortex suggests an important role in functional integration.
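Two of the network measures mentioned, node degree and node strength, can be illustrated on a toy weighted adjacency matrix; the values below are invented, and betweenness centrality would be computed analogously over shortest paths.

```python
def degree_and_strength(adj):
    """Node degree (number of connections) and strength (sum of connection
    weights) from a symmetric weighted adjacency matrix."""
    degree = [sum(1 for w in row if w > 0) for row in adj]
    strength = [sum(row) for row in adj]
    return degree, strength

# Toy 5-region network in which region 2 acts as a connector hub,
# linking the {0, 1} module to the {3, 4} module
adj = [
    [0.0, 0.8, 0.5, 0.0, 0.0],
    [0.8, 0.0, 0.9, 0.0, 0.0],
    [0.5, 0.9, 0.0, 0.7, 0.6],
    [0.0, 0.0, 0.7, 0.0, 0.4],
    [0.0, 0.0, 0.6, 0.4, 0.0],
]
degree, strength = degree_and_strength(adj)
hub = max(range(len(adj)), key=lambda i: strength[i])
```

In this toy network, region 2 is the only node connected to both modules, so it has the highest degree and strength, mirroring the role the abstract ascribes to the structural core.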
Abstract:
This was a descriptive, retrospective study, with a quantitative method, aimed at analyzing the nursing diagnoses contained in the records of children 0 to 36 months of age who attended infant health nursing consultations. Documentary analysis and the cross-mapping technique were used. Fifty-six different nursing diagnoses were encountered, of which 33 (58.9%) corresponded to diagnoses contained in the Nomenclature of Nursing Diagnoses and Interventions and 23 (41.1%) were derived from ICNP® Version 1.0. Of the 56 nursing diagnoses, 43 (76.8%) were considered deviations from normalcy. It was concluded that the infant health nursing consultations enabled the identification of situations of normalcy and abnormality, with an emphasis on diagnoses of deviations from normalcy. Standardized language favors nursing documentation, contributing to patient care and facilitating communication between nurses and other health professionals.
Abstract:
This study aimed to analyze nipple trauma resulting from breastfeeding from a dermatological perspective. Two integrative literature reviews were conducted: the first on the definitions, classification and evaluation methods of nipple trauma, and the second on validation studies related to this theme. The first review included 20 studies; only one third defined nipple trauma, more than half did not define the nipple injuries reported, and each author assessed the injuries in a particular way, without consensus. In the second integrative review, no validation study or algorithm related to nipple trauma resulting from breastfeeding was found. This demonstrates that the nipple injuries mentioned in the first review had not undergone validation studies, which explains the lack of consensus identified regarding the definition, classification and assessment methods of nipple trauma.
Abstract:
Objective: To identify the nursing care prescribed for patients at risk for pressure ulcer (PU) and to compare it with the Nursing Interventions Classification (NIC) interventions. Method: Cross-mapping study conducted in a university hospital. The sample was composed of 219 adult patients hospitalized in clinical and surgical units. The inclusion criteria were a score ≤ 13 on the Braden Scale and one of the following nursing diagnoses: Self-care deficit syndrome, Impaired physical mobility, Impaired tissue integrity, Impaired skin integrity, or Risk for impaired skin integrity. The data were collected retrospectively from a nursing prescription system and analyzed by cross mapping. Result: Thirty-two different nursing care measures to prevent PU were identified and mapped to 17 different NIC interventions, among them Skin surveillance, Pressure ulcer prevention and Positioning. Conclusion: The cross mapping showed similarities between the prescribed nursing care and the NIC interventions.
Abstract:
Objective: To identify the nursing care prescribed for hospitalized patients at risk of falls and to compare it with the interventions of the Nursing Interventions Classification (NIC). Method: A cross-sectional study carried out in a university hospital in southern Brazil, with retrospective data collection from the nursing records system. The sample consisted of 174 adult patients admitted to medical and surgical units with the nursing diagnosis Risk for falls. The prescribed care was compared with the NIC interventions by the cross-mapping method. Results: The most prevalent care measures were keeping the bed rails raised, guiding patients/family regarding the risks and prevention of falls, keeping the bell within reach of patients, and keeping patients’ belongings nearby, mapped to the interventions Environmental Management: Safety and Fall Prevention. Conclusion: The care prescribed in clinical practice was corroborated by the NIC reference.