859 results for Business enterprises -- Electronic data processing -- Study and teaching (Higher) -- Chile
Abstract:
Morphea, granuloma annulare (GA) and lichen sclerosus et atrophicans (LSA) have also been suggested to be linked to Borrelia infection. Previous studies based on serologic data or detection of Borrelia by immunohistochemistry and polymerase chain reaction (PCR) reported contradictory results. Thus, we examined skin biopsies of morphea, GA and LSA by PCR to assess the prevalence of Borrelia DNA in an endemic area and to compare our results with data in the literature.
Abstract:
Java Enterprise Applications (JEAs) are complex software systems written using multiple technologies. Moreover, they are usually distributed systems and use a database for persistence. A particular problem that appears in the design of these systems is the lack of a rich business model. In this paper we propose a technique to support the recovery of rich business objects starting from anemic Data Transfer Objects (DTOs). By exposing the code duplication in the application elements that use the DTOs, we suggest which business logic can be moved from other classes into the DTOs.
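A minimal sketch of the idea follows. It is written in Python rather than Java for brevity, and all names (LineItemDTO, InvoiceDTO, BillingService, ReportingService) and the invoice-total computation are invented for illustration; they are not taken from the paper.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class LineItemDTO:
    unit_price: float
    quantity: int


@dataclass
class InvoiceDTO:
    """Anemic DTO: carries data only, no behaviour."""
    items: List[LineItemDTO] = field(default_factory=list)


class BillingService:
    def charge(self, invoice: InvoiceDTO) -> float:
        # duplicated business logic, occurrence 1
        return sum(i.unit_price * i.quantity for i in invoice.items)


class ReportingService:
    def invoice_total(self, invoice: InvoiceDTO) -> float:
        # duplicated business logic, occurrence 2 (same computation as BillingService.charge)
        return sum(i.unit_price * i.quantity for i in invoice.items)


# The duplication above is the signal that the logic belongs on the object itself:
@dataclass
class Invoice:
    """Recovered 'rich' business object: data plus the behaviour that was duplicated."""
    items: List[LineItemDTO] = field(default_factory=list)

    def total(self) -> float:
        return sum(i.unit_price * i.quantity for i in self.items)
```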
Abstract:
Applying location-focused data protection law within a location-agnostic cloud computing framework is fraught with difficulties. While the Proposed EU Data Protection Regulation introduces many changes to the current data protection framework, the complexities of data processing in the cloud involve multiple layers of actors and intermediaries that have not been properly addressed. This leaves gaps in the regulation when it is analyzed in cloud scenarios. This paper gives a brief overview of the provisions of the regulation that will have an impact on cloud transactions and addresses the missing links. It is hoped that these loopholes will be reconsidered before the final version of the law is passed, in order to avoid unintended consequences.
Abstract:
AIMS This study's objective was to assess the safety of non-therapeutic atomoxetine exposures reported to the US National Poison Database System (NPDS). METHODS This is a retrospective database study of non-therapeutic single-agent ingestions of atomoxetine in children and adults reported to the NPDS between 2002 and 2010. RESULTS A total of 20 032 atomoxetine exposures were reported during the study period, and 12 370 of these were single-agent exposures. The median age was 9 years (interquartile range 3–14), and 7380 were male (59.7%). Of the single-agent exposures, 8813 (71.2%) were acute, 3315 (26.8%) were acute-on-chronic, and 166 (1.3%) were chronic. In 10 608 (85.8%) cases the exposure was unintentional, in 1079 (8.7%) it was a suicide attempt, and in 629 (5.1%) it was abuse. Of these cases, 3633 (29.4%) were managed at health-care facilities. Acute-on-chronic exposure was associated with an increased risk of a suicidal reason for exposure compared with acute ingestions (odds ratio 1.44, 95% confidence interval 1.26–1.65). The most common clinical effects were drowsiness or lethargy (709 cases; 5.7%), tachycardia (555; 4.5%), and nausea (388; 3.1%). Major toxicity was observed in 21 cases (seizures in nine (42.9%), tachycardia in eight (38.1%), coma in six (28.6%), and ventricular dysrhythmia in one case (4.8%)). CONCLUSIONS Non-therapeutic atomoxetine exposures were largely safe, although seizures were observed in rare cases.
Abstract:
Background: The US has higher rates of teen births and sexually transmitted infections (STIs) than other developed countries, and Texas youth are disproportionately impacted. Purpose: To review local, state, and national data on teens’ engagement in sexual risk behaviors to inform policy and practice related to teen sexual health. Methods: 2009 middle school and high school Youth Risk Behavior Survey (YRBS) data, and data from All About Youth, a middle school study conducted in a large urban school district in Texas, were analyzed to assess the prevalence of sexual initiation, including the initiation of non-coital sex, and the prevalence of sexual risk behaviors among Texas and US youth. Results: A substantial proportion of middle and high school students are having sex. Sexual initiation begins as early as 6th grade and increases steadily through 12th grade, with almost two-thirds of high school seniors being sexually experienced. Many teens are not protecting themselves from unintended pregnancy or STIs: nationally, 80% and 39% of high school students did not use birth control pills or a condom, respectively, the last time they had sex. Many middle and high school students are engaging in oral and anal sex, two behaviors that increase the risk of contracting an STI, including HIV. In Texas, an estimated 689,512 out of 1,327,815 public high school students are sexually experienced – over half (52%) of the total high school population. Texas students surpass their US peers in several sexual risk behaviors, including number of lifetime sexual partners, being currently sexually active, and not using effective methods of birth control or dual protection when having sex. They are also less likely to receive HIV/AIDS education in school. Conclusion: Changes in policy and practice, including implementation of evidence-based sex education programs in middle and high schools and increased access to integrated, teen-friendly sexual and reproductive health services, are urgently needed at the state and national levels to address these issues effectively.
Abstract:
The past 1500 years provide a valuable opportunity to study the response of the climate system to external forcings. However, the integration of paleoclimate proxies with climate modeling is critical to improving the understanding of climate dynamics. In this paper, a climate system model and proxy records are therefore used to study the role of natural and anthropogenic forcings in driving the global climate. The inverse and forward approaches to paleoclimate data–model comparison are applied, and sources of uncertainty are identified and discussed. In the first of two case studies, the climate model simulations are compared with multiproxy temperature reconstructions. Robust solar and volcanic signals are detected in Southern Hemisphere temperatures, with a possible volcanic signal detected in the Northern Hemisphere. The anthropogenic signal dominates during the industrial period. It is also found that seasonal and geographical biases may cause multiproxy reconstructions to overestimate the magnitude of the long-term preindustrial cooling trend. In the second case study, the model simulations are compared with a coral δ18O record from the central Pacific Ocean. It is found that greenhouse gases, solar irradiance, and volcanic eruptions all influence the mean state of the central Pacific, but there is no evidence that natural or anthropogenic forcings have any systematic impact on El Niño–Southern Oscillation. The proxy–climate relationship is found to change over time, challenging the assumption of stationarity that underlies the interpretation of paleoclimate proxies. These case studies demonstrate the value of paleoclimate data–model comparison but also highlight the limitations of current techniques and demonstrate the need to develop alternative approaches.
Abstract:
INTRODUCTION HIV-infected pregnant women are very likely to engage in HIV medical care to prevent transmission of HIV to their newborn. After delivery, however, childcare and competing commitments might lead to disengagement from HIV care. The aim of this study was to quantify loss to follow-up (LTFU) from HIV care after delivery and to identify risk factors for LTFU. METHODS We used data on 719 pregnancies within the Swiss HIV Cohort Study from 1996 to 2012 with information on follow-up visits available. Two LTFU events were defined: no clinical visit for >180 days and no visit for >360 days in the year after delivery. Logistic regression analysis was used to identify risk factors for an LTFU event after delivery. RESULTS Median maternal age at delivery was 32 years (IQR 28-36); 357 (49%) women were black, 280 (39%) white, 56 (8%) Asian and 4% of other ethnicities. One hundred and seven (15%) women reported a history of injection drug use (IDU). The majority (524, 73%) of women received their HIV diagnosis before pregnancy, most of those (413, 79%) had lived with diagnosed HIV for longer than three years, and two-thirds (342, 65%) were already on antiretroviral therapy (ART) at the time of conception. Of the 181 women diagnosed during pregnancy by a screening test, 80 (44%) were diagnosed in the first trimester, 67 (37%) in the second and 34 (19%) in the third trimester. Of 357 (69%) women who had been seen in HIV medical care during the three months before conception, 93% achieved an undetectable HIV viral load (VL) at delivery. Of 62 (12%) women whose last medical visit was more than six months before conception, only 72% achieved an undetectable VL (p=0.001). Overall, 247 (34%) women were LTFU over 180 days in the year after delivery and 86 (12%) women were LTFU over 360 days, with 43 (50%) of those women later returning to care. Being LTFU for 180 days was significantly associated with a history of IDU (aOR 1.73, 95% CI 1.09-2.77, p=0.021) and not achieving an undetectable VL at delivery (aOR 1.79, 95% CI 1.03-3.11, p=0.040) after adjusting for maternal age, ethnicity, time of HIV diagnosis and being on ART at conception. CONCLUSIONS Women with a history of IDU and women with a detectable VL at delivery were more likely to be LTFU after delivery. This is of concern regarding their own health, as well as the risk for sexual partners and subsequent pregnancies. Further strategies should be developed to enhance retention in medical care beyond pregnancy.
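As a rough illustration of the adjusted-odds-ratio analysis described above (not the authors' code), the following Python sketch fits a logistic regression and exponentiates the coefficients to obtain aORs with 95% confidence intervals. The file name and all column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical analysis file: one row per pregnancy, binary outcome 'ltfu_180',
# the two exposures of interest and the covariates named in the abstract.
df = pd.read_csv("cohort.csv")

X = df[["idu_history", "detectable_vl_delivery",
        "maternal_age", "ethnicity_black", "dx_before_pregnancy", "art_at_conception"]]
X = sm.add_constant(X)
y = df["ltfu_180"]

fit = sm.Logit(y, X).fit()

# Adjusted odds ratios with 95% confidence intervals
aor = pd.DataFrame({"aOR": np.exp(fit.params),
                    "2.5%": np.exp(fit.conf_int()[0]),
                    "97.5%": np.exp(fit.conf_int()[1])})
print(aor.round(2))
```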
Abstract:
A feasibility study by Pail et al. (Can GOCE help to improve temporal gravity field estimates? In: Ouwehand L (ed) Proceedings of the 4th International GOCE User Workshop, ESA Publication SP-696, 2011b) shows that GOCE (‘Gravity field and steady-state Ocean Circulation Explorer’) satellite gravity gradiometer (SGG) data, in combination with GPS-derived orbit data (high-low satellite-to-satellite tracking: SST-hl), can be used to stabilize and reduce the striping pattern of a bi-monthly GRACE (‘Gravity Recovery and Climate Experiment’) gravity field estimate. In this study, several monthly (and bi-monthly) combinations of GRACE with GOCE SGG and GOCE SST-hl data on the basis of normal equations are investigated. Our aim is to assess the role of the gradients alone in the combination, and whether one month of GOCE observations already provides sufficient data to have an impact on the combination. The estimation of clean and stable monthly GOCE SGG normal equations at high resolution (beyond d/o 150) is found to be difficult, and the SGG component alone does not add significant value to monthly and bi-monthly GRACE gravity fields. Comparisons of GRACE-only and combined monthly and bi-monthly solutions show that the striping pattern can only be reduced when both GOCE observation types (SGG, SST-hl) are used, mainly between d/o 45 and 60.
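For readers unfamiliar with combination on the normal-equation level, the generic form is sketched below. The relative weights are illustrative placeholders; the abstract does not state the weighting scheme actually used.

```latex
% Generic combination of normal equations from several observation types
\[
\left( N_{\mathrm{GRACE}} + w_{\mathrm{SGG}}\, N_{\mathrm{SGG}}
      + w_{\mathrm{SST}}\, N_{\mathrm{SST\text{-}hl}} \right)\hat{x}
 = b_{\mathrm{GRACE}} + w_{\mathrm{SGG}}\, b_{\mathrm{SGG}}
      + w_{\mathrm{SST}}\, b_{\mathrm{SST\text{-}hl}},
\]
% where $N_i = A_i^{\mathsf{T}} P_i A_i$ and $b_i = A_i^{\mathsf{T}} P_i \ell_i$ are the
% normal matrix and right-hand side of each observation type, and $\hat{x}$ contains
% the estimated spherical-harmonic coefficients.
```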
Abstract:
Accurate rainfall data are the key input parameter for modelling river discharge and soil loss. Remote areas of Ethiopia often lack adequate precipitation data, and where such data are available there can be substantial temporal or spatial gaps. To counter this challenge, the Climate Forecast System Reanalysis (CFSR) of the National Centers for Environmental Prediction (NCEP) readily provides weather data for any geographic location on earth between 1979 and 2014. This study assesses the applicability of CFSR weather data to three watersheds in the Blue Nile Basin in Ethiopia. To this end, the Soil and Water Assessment Tool (SWAT) was set up to simulate discharge and soil loss, using CFSR and conventional weather data, in three small-scale watersheds ranging from 112 to 477 ha. Calibrated simulation results were compared to observed river discharge and observed soil loss over a period of 32 years. The conventional weather data resulted in very good discharge outputs for all three watersheds, while the CFSR weather data resulted in unsatisfactory discharge outputs for all three gauging stations. Soil loss simulation with conventional weather inputs yielded satisfactory outputs for two of the three watersheds, while the CFSR weather input gave unsatisfactory results for all three. Overall, the simulations with the conventional data produced far better results for discharge and soil loss than the simulations with CFSR data. The simulations with CFSR data were unable to adequately represent the specific regional climate of the three watersheds, and performed even worse in climatic areas with two rainy seasons. Hence, CFSR data should not be used lightly in remote areas that lack conventional weather data and where no prior analysis is possible.
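The abstract does not state which goodness-of-fit statistic underlies the "satisfactory"/"unsatisfactory" ratings; SWAT studies commonly report the Nash–Sutcliffe efficiency (NSE), so the following sketch shows that criterion purely for illustration, with made-up discharge values.

```python
import numpy as np

def nash_sutcliffe(observed, simulated) -> float:
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2); 1 is a perfect fit."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Illustrative (invented) daily discharge values in m^3/s
obs = np.array([1.2, 0.8, 2.5, 3.1, 1.9])
sim = np.array([1.0, 0.9, 2.2, 3.4, 1.7])
print(round(nash_sutcliffe(obs, sim), 2))  # NSE > 0.5 is often taken as "satisfactory"
```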
Abstract:
The purpose of this study was to investigate a selection of children's historical nonfiction literature for evidence of coherence. Although research has been conducted on the coherence of textbook material and its influence on comprehension, there has been limited study of coherence in children's nonfiction literature. Generally, textual coherence has been seen as critical to the comprehensibility of content-area textbooks because it concerns the unity of connections among ideas and information. Disciplinary coherence concerns the extent to which authors of historical text show readers how historians think and write. Since young readers are apprentices in learning historical content and the conventions of historical thinking, evidence of disciplinary coherence is significant in nonfiction literature for young readers. The sample for the study contained 32 books published between 1989 and 2000, ranging in length from fewer than 90 pages to more than 150 pages. Content analysis was the quantitative research technique used to measure 84 variables of textual and disciplinary coherence in three passages of each book, expressed as proportions of the total number of words for each book. Reliability analyses and an examination of 750 correlations showed the extent to which the variables were related across the books. Three important findings emerged from the study that should be considered in the selection and use of children's historical nonfiction literature in classrooms. First, characteristics of coherence are significantly related to one another in high-quality nonfiction literature. Second, shorter books have a higher proportion of textual coherence than longer books, as measured in three passages. Third, the presence of the author is related to characteristics of coherence throughout the books. The findings show that nonfiction literature offers students content that researchers have found textbooks lack. Both younger and older students have the opportunity to learn the conventions of historical thinking as they learn content through nonfiction literature. Further, the children's literature represented in the Orbis Pictus list shows students that authors select, interpret, and question information, and offer other interpretations. The implications of the study for teaching history, teacher preparation in content and literacy, school practices, children's librarians, and publishers of children's nonfiction are discussed.
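The normalization-and-correlation step described above can be pictured with the following hypothetical Python sketch; the file and column names are invented and the study itself does not publish code.

```python
import pandas as pd

# Hypothetical layout: one row per sampled passage, with the raw count of each
# coherence variable plus the passage word count.
counts = pd.read_csv("coherence_counts.csv")   # columns: book_id, total_words, var_1 ... var_n

variables = counts.drop(columns=["book_id", "total_words"])
proportions = variables.div(counts["total_words"], axis=0)   # proportion of total words

per_book = proportions.groupby(counts["book_id"]).mean()     # average over the three passages
print(per_book.corr().round(2))                              # pairwise correlations among variables
```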
Abstract:
The infant mortality rate (IMR) is considered one of the most important indices of a country's well-being. Countries around the world and health organizations such as the World Health Organization are dedicating resources, knowledge and energy to reducing infant mortality rates. The well-known Millennium Development Goal 4 (MDG 4), whose aim is to achieve a two-thirds reduction of the under-five mortality rate between 1990 and 2015, is an example of this commitment. In this study our goal is to model the trends in IMR from the 1950s to the 2010s for selected countries. We would like to know how the IMR changes over time and how it differs across countries. IMR data collected over time form a time series, and the repeated observations of an IMR time series are not statistically independent. In modeling the trend of IMR, it is therefore necessary to account for these correlations. We propose to use the generalized least squares method in a general linear model setting to deal with the variance-covariance structure in our model. To estimate the variance-covariance matrix, we draw on time-series models, especially autoregressive and moving-average models. Furthermore, we compare results from the general linear model with a correlation structure to those from ordinary least squares, which ignores the correlation structure, to check how much the estimates change.
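A minimal sketch of this kind of comparison, using an AR(1) error structure and an invented IMR series (not the authors' data or code):

```python
import numpy as np
import statsmodels.api as sm

# Invented IMR-like series: exponential decline plus noise, 1950-2010
years = np.arange(1950, 2011)
imr = 140.0 * np.exp(-0.03 * (years - 1950)) + np.random.default_rng(0).normal(0, 2, years.size)

X = sm.add_constant(years - years.min())       # intercept + linear time trend

ols_fit = sm.OLS(imr, X).fit()                 # ignores serial correlation
gls_fit = sm.GLSAR(imr, X, rho=1).iterative_fit(maxiter=10)   # GLS with AR(1) errors

print("OLS trend:  ", ols_fit.params[1].round(3), " SE:", ols_fit.bse[1].round(3))
print("GLSAR trend:", gls_fit.params[1].round(3), " SE:", gls_fit.bse[1].round(3))
```

The point of the comparison is that the trend estimates may be similar, but the standard errors (and hence inference) change once the autocorrelation of the residuals is modelled.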
Abstract:
Clinical Research Data Quality Literature Review and Pooled Analysis
We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods used. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 to 5019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70–5019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to those for double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis
Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language, and this lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization of data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractor's Perceptions of Factors Impacting the Accuracy of Abstracted Data
Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses, yet the factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess their perceptions of these factors. The Delphi process identified 9 factors not found in the literature and differed from the literature on 5 factors in the top 25%. The Delphi results refuted seven factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms
Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates, yet distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support employed in a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements, and exceedingly so for 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping or calculation. The representational analysis used here can be applied to identify data elements with high cognitive demands.
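For readers unfamiliar with the "errors per 10,000 fields" metric used in the pooled analysis, the normalization is simply a rate per fixed denominator; the counts below are invented, not drawn from the reviewed studies.

```python
def errors_per_10000(n_errors: int, n_fields: int) -> float:
    """Normalize an error count to a rate per 10,000 data fields."""
    return 10000 * n_errors / n_fields

# Hypothetical study: 37 discrepant values found among 52,000 audited fields
print(round(errors_per_10000(37, 52_000), 1))   # ~7.1 errors per 10,000 fields
```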
Abstract:
Due to its strong salinity gradient and small temperature gradient, the Mediterranean provides an ideal setting to study the impact of salinity on the incorporation of Mg into foraminiferal tests. We have investigated tests of Globorotalia inflata and Globigerina bulloides in plankton tow and core top samples from the Western Mediterranean, using ICP-OES for bulk analyses and LA-ICP-MS for analyses of individual chambers in single specimens. Mg/Ca values observed in G. inflata are consistent with existing calibrations, whereas G. bulloides had significantly higher Mg/Ca than predicted, particularly in core top samples from the easterly stations. Scanning electron microscopy and laser ablation ICP-MS revealed secondary overgrowths on some tests, which could explain the observed high core top Mg/Ca. We suggest that Mediterranean intermediate and deep waters, which are supersaturated with respect to calcite, cause these overgrowths and therefore increase bulk Mg/Ca. However, different species are affected by diagenesis to different degrees, probably due to different test morphologies. Our results provide new perspectives on reported anomalously high Mg/Ca in sedimentary foraminifera and on the applicability of Mg/Ca paleothermometry in high-salinity settings, by showing that (1) part of the signal is generated by precipitation of inorganic calcite on the foraminifer test due to the increased calcite saturation state of the water, and (2) species with high surface-to-volume ratios are potentially more affected by secondary Mg-rich calcite encrustation.
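For context, Mg/Ca paleothermometry typically rests on an exponential calibration of the general form below; the species-specific constants A and B are not given in the abstract, and the exact calibrations used are those of the cited literature.

```latex
% Generic exponential Mg/Ca-temperature calibration for planktonic foraminifera
\[
\mathrm{Mg/Ca} = B\, e^{A\,T}
\qquad\Longleftrightarrow\qquad
T = \frac{1}{A}\,\ln\!\left(\frac{\mathrm{Mg/Ca}}{B}\right),
\]
% so any secondary, inorganically precipitated Mg-rich calcite that raises the measured
% Mg/Ca translates directly into an overestimated calcification temperature T.
```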
Abstract:
In the present study, we report the results of comprehensive amino acid (AA) analyses of four Indian lakes from different climate regimes. We focus on the investigation of sediment cores retrieved from the lakes, but data from modern sediments as well as vascular plant, soil, and suspended particulate matter samples from the individual lakes are also presented. Commonly used degradation and organic matter source indices are tested for their applicability to the lake sediments, and we discuss potential reasons for possible limitations. A principal component analysis (PCA) including the monomeric AA composition of organic matter from all analysed samples indicates that differences in organic matter sources and the environmental properties of the individual lakes are responsible for the major variability in the monomeric AA distribution of the different samples. However, the PCA also yields a factor that most probably separates the samples according to their state of organic matter degradation. Using the factor loadings of the individual AA monomers, we calculate a lake sediment degradation index (LI) that might be applicable to other palaeo-lake investigations.
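The abstract does not give the exact formula for LI; the sketch below only illustrates the general approach of scoring samples on a degradation-related principal component derived from standardized AA mole percentages. File and column names are invented, and which component reflects degradation must be chosen by inspecting the loadings.

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical input: mole-% of each AA monomer per sample
aa = pd.read_csv("aa_mole_percent.csv", index_col="sample")

z = StandardScaler().fit_transform(aa)        # standardize each monomer across samples
pca = PCA(n_components=2).fit(z)

scores = pca.transform(z)                     # sample scores on the principal components
li = scores[:, 1]                             # component interpreted as degradation state
                                              # (index 1 is a placeholder; choose by inspection)
print(pd.Series(li, index=aa.index).sort_values())
```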
Abstract:
Monitoring the impact of sea storms on coastal areas is fundamental to studying beach evolution and the vulnerability of low-lying coasts to erosion and flooding. Modelling wave runup on a beach is possible, but it requires accurate topographic data and model tuning, which can be done by comparing observed and modelled runup. In this study we collected aerial photos using an Unmanned Aerial Vehicle (UAV) after two different swells over the same study area. We merged the point cloud obtained by photogrammetry with multibeam data in order to obtain a complete beach topography. Then, on each set of rectified and georeferenced UAV orthophotos, we identified the maximum wave runup for each event by recognizing the wet area left by the waves. We then used our topography and numerical models to simulate the wave runup and compared the model results to the values observed during the two events. Our results highlight the potential of the presented methodology, which integrates UAV platforms, photogrammetry and Geographic Information Systems to provide faster and cheaper information on beach topography and geomorphology than traditional techniques, without losing accuracy. We use the results obtained from this technique as the topographic base for a model that calculates runup for the two swells. The observed and modelled runups are consistent, opening new directions for future research.
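The abstract does not name the runup model used; one widely used empirical estimator is the Stockdon et al. (2006) parameterization, sketched below purely for illustration, with invented input values.

```python
import math

def stockdon_r2(h0: float, t0: float, beta_f: float) -> float:
    """2% exceedance runup (m) from deep-water wave height h0 (m), peak period t0 (s),
    and foreshore beach slope beta_f (dimensionless), after Stockdon et al. (2006)."""
    l0 = 9.81 * t0 ** 2 / (2 * math.pi)                               # deep-water wavelength
    setup = 0.35 * beta_f * math.sqrt(h0 * l0)                        # wave setup term
    swash = math.sqrt(h0 * l0 * (0.563 * beta_f ** 2 + 0.004)) / 2.0  # swash term
    return 1.1 * (setup + swash)

# Illustrative values (not from the study): 3 m, 9 s swell on a 0.08 foreshore slope
print(round(stockdon_r2(3.0, 9.0, 0.08), 2), "m")
```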