949 results for answer errors
Abstract:
The protein lysate array is an emerging technology for quantifying protein concentration ratios in multiple biological samples. It is gaining popularity and has the potential to answer questions about post-translational modifications and protein pathway relationships. Statistical inference for a parametric quantification procedure has been inadequately addressed in the literature, mainly due to two challenges: the increasing dimension of the parameter space and the need to account for dependence in the data. Each chapter of this thesis addresses one of these issues. In Chapter 1, an introduction to protein lysate array quantification is presented, followed by the motivations and goals for this thesis work. In Chapter 2, we develop a multi-step procedure for sigmoidal models, ensuring consistent estimation of the concentration level with full asymptotic efficiency. The results obtained in this chapter justify inferential procedures based on large-sample approximations. Simulation studies and real data analysis are used to illustrate the performance of the proposed method in finite samples. The multi-step procedure is simpler in both theory and computation than the single-step least squares method used in current practice. In Chapter 3, we introduce a nonlinear mixed-effects model to account for the dependence structure of the errors. We consider a method to approximate the maximum likelihood estimator of all the parameters. Using simulation studies on various error structures, we show that for data with non-i.i.d. errors the proposed method leads to more accurate estimates and better confidence intervals than the existing single-step least squares method.
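For context, lysate array quantification typically involves fitting a sigmoidal response curve to serial-dilution readings. The following is a minimal single-step least squares sketch of such a fit using a four-parameter logistic model; the data, parameter names and starting values are illustrative assumptions, not the thesis's multi-step estimator.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(log_conc, a, d, c, b):
    """Four-parameter logistic (sigmoidal) response curve.
    a: lower asymptote, d: upper asymptote, c: inflection point, b: slope."""
    return a + (d - a) / (1.0 + np.exp(-b * (log_conc - c)))

# Hypothetical dilution-series data: log dilution steps and observed intensities
log_conc = np.array([0, 1, 2, 3, 4, 5, 6, 7], dtype=float)
intensity = np.array([0.9, 1.2, 2.0, 3.6, 6.1, 8.0, 9.1, 9.5])

# Single-step least squares fit of the sigmoidal curve, for illustration only
params, cov = curve_fit(four_pl, log_conc, intensity, p0=[1.0, 10.0, 3.5, 1.0])
print("fitted (a, d, c, b):", params)
```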
Abstract:
To determine the prevalence of refractive errors in the public and private school system in the city of Natal, Northeastern Brazil. Methods: Refractometry was performed on both eyes of 1,024 randomly selected students enrolled in the 2001 school year, and the data were evaluated with the SPSS Data Editor 10.0. Ametropia was divided into: 1- 0.1 to 0.99 diopter (D); 2- 1.0 to 2.99D; 3- 3.00 to 5.99D; and 4- 6D or greater. Astigmatism was regrouped into: I- with-the-rule (axis from 0 to 30 and 150 to 180 degrees), II- against-the-rule (axis between 60 and 120 degrees) and III- oblique (axis between >30 and <60 and >120 and <150 degrees). The age groups were categorized as: 1- 5 to 10 years, 2- 11 to 15 years, 3- 16 to 20 years, 4- over 21 years. Results: Among refractive errors, hyperopia was the most common at 71%, followed by astigmatism (34%) and myopia (13.3%). Of the students with myopia and hyperopia, 48.5% and 34.1% had astigmatism, respectively. With respect to diopters, 58.1% of myopic students were in group 1, and 39% were distributed between groups 2 and 3. Hyperopia was mostly found in group 1 (61.7%), as was astigmatism (70.6%). The association of the astigmatism axes of both eyes showed 92.5% with a with-the-rule axis in both eyes, while the percentage for those with an against-the-rule axis was 82.1% and even lower for the oblique axis (50%). Conclusion: The results differed from those of most international studies, mainly from the Orient, which point to myopia as the most common refractive error, and corroborate national studies, in which hyperopia predominates.
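The axis grouping described above amounts to a simple classification rule; a minimal sketch following the cut-offs stated in the abstract (the function name is hypothetical):

```python
def classify_axis(axis_degrees: float) -> str:
    """Group an astigmatism axis using the cut-offs given in the abstract."""
    if 0 <= axis_degrees <= 30 or 150 <= axis_degrees <= 180:
        return "with-the-rule"
    if 60 <= axis_degrees <= 120:
        return "against-the-rule"
    return "oblique"  # >30 and <60, or >120 and <150

print(classify_axis(10))   # with-the-rule
print(classify_axis(90))   # against-the-rule
print(classify_axis(45))   # oblique
```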
Abstract:
This thesis studies the field of asset price bubbles. It comprises three independent chapters. Each of these chapters either directly or indirectly analyses the existence or implications of asset price bubbles. The type of bubbles assumed in each of these chapters is consistent with rational expectations. Thus, the kind of price bubbles investigated here are known in the literature as rational bubbles. The following describes the three chapters. Chapter 1: This chapter attempts to explain the recent US housing price bubble by developing a heterogeneous agent endowment economy asset pricing model with risky housing, endogenous collateral and defaults. Investment in housing is subject to an idiosyncratic risk and some mortgages are defaulted on in equilibrium. We analytically derive the leverage, or the endogenous loan-to-value ratio. This variable comes from a limited participation constraint in a one-period mortgage contract with monitoring costs. Our results show that low values of housing investment risk produce a credit easing effect, encouraging excess leverage, and generate credit-driven rational price bubbles in the housing good. Conversely, high values of housing investment risk produce a credit crunch characterized by tight borrowing constraints, low leverage and low house prices. Furthermore, the leverage ratio was found to be procyclical and the rate of defaults countercyclical, consistent with empirical evidence. Chapter 2: It is widely believed that financial assets have considerable persistence and are susceptible to bubbles. However, identification of this persistence and of potential bubbles is not straightforward. This chapter tests for price bubbles in the United States housing market, accounting for long memory and structural breaks. The intuition is that the presence of long memory negates price bubbles, while the presence of breaks could artificially induce bubble behaviour. Hence, we use semi-parametric Whittle and parametric ARFIMA procedures, which are consistent under a variety of residual biases, to estimate the value of the long memory parameter, d, of the log rent-price ratio. We find that the semi-parametric estimation procedures, being robust to non-normality and heteroskedastic errors, found far more bubble regions than the parametric ones. A structural break was identified in the mean and trend of all the series which, when accounted for, removed bubble behaviour in a number of regions. Importantly, the United States housing market showed evidence for rational bubbles at both the aggregate and regional levels. In the third and final chapter, we attempt to answer the following question: to what extent should individuals participate in the stock market and hold risky assets over their lifecycle? We answer this question by employing a lifecycle consumption-portfolio choice model with housing, labour income and time-varying predictable returns, where the agents are constrained in the level of their borrowing. We first analytically characterize and then numerically solve for the optimal allocation to the risky asset, comparing the return predictability case with that of IID returns. We successfully resolve the puzzles and find equity holdings and participation rates close to the data. We also find that return predictability substantially alters both the level of risky portfolio allocation and the rate of stock market participation.
High factor (dividend-price ratio) realization and high persistence of factor process indicative of stock market bubbles raise the amount of wealth invested in risky assets and the level of stock market participation, respectively. Conversely, rare disasters were found to bring down these rates, the change being severe for investors in the later years of the life-cycle. Furthermore, investors following time varying returns (return predictability) hedged background risks significantly better than the IID ones.
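As an illustration of the semi-parametric Whittle approach mentioned in Chapter 2, a minimal local Whittle estimate of the long memory parameter d might look like the sketch below; the simulated series, bandwidth rule and search bounds are assumptions, not the chapter's specification.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def local_whittle_d(x, m=None):
    """Local Whittle estimate of the long-memory parameter d of series x,
    using the first m periodogram ordinates (the bandwidth)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(n ** 0.65)                       # common bandwidth rule of thumb
    freqs = 2 * np.pi * np.arange(1, m + 1) / n  # Fourier frequencies
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    periodogram = (np.abs(dft) ** 2) / (2 * np.pi * n)

    def objective(d):
        g = np.mean(periodogram * freqs ** (2 * d))
        return np.log(g) - 2 * d * np.mean(np.log(freqs))

    res = minimize_scalar(objective, bounds=(-0.49, 1.49), method="bounded")
    return res.x

# Hypothetical use on a simulated random walk; d should come out close to 1
rng = np.random.default_rng(0)
series = np.cumsum(rng.standard_normal(1000))
print("estimated d:", round(local_whittle_d(series), 2))
```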
Abstract:
The Asari (= Manila) clam, Ruditapes philippinarum, is the second most produced bivalve mollusc in the world and, in many coastal areas, raises important socio-economic issues. In Europe, this species was introduced after 1973. In Arcachon Bay, after a decade of aquaculture attempts, the Asari clam rapidly established a neo-naturalized population which is now fished. However, recent studies emphasized the decline of population and individual performances. In the framework of a national project (REPAMEP), some elements of fitness, stressors and responses in Arcachon Bay were measured and compared to international data (41 publications, 9 countries). The condition index (CI = flesh weight/shell weight) was the lowest among all compared sites. Variation in average Chla concentration explained 30% of the variation in CI among different areas. Among potential diseases, perkinsosis was particularly prevalent in Arcachon Bay, with high abundance, and Asari clams underwent Brown Muscle Disease, a pathology strictly restricted to this lagoon. Overall element contamination was relatively low, although arsenic, cobalt, nickel and chromium displayed higher values than in other ecosystems where the Asari clam is exploited. Finally, the total hemocyte count (THC) of Asari clams in Arcachon Bay, related to immune system activity, exhibited values that were also below what is generally observed elsewhere. In conclusion, this study, with all reserves due to the heterogeneity of the available data, suggests that the particularly low fitness of the Asari clam in Arcachon Bay is due to poor trophic conditions, the high prevalence and intensity of a disease (perkinsosis), moderate inorganic contamination, and poor efficiency of the immune system.
Abstract:
The thermal and aerial environment conditions inside animal housing facilities change throughout the day due to the influence of the external environment. For statistical and geostatistical analyses to be representative, a large number of spatially distributed points within the facility area must be monitored. This work proposes that the variation over time of the environmental variables of interest for animal production, monitored inside animal housing facilities, can be modelled accurately from time-discrete records. The objective of this work was to develop a numerical method to correct the temporal variations of these environmental variables, transforming the data so that the observations become independent of the time spent during measurement. The proposed method approximates the values recorded with time delays to those expected at the exact moment of interest, as if the data had been measured simultaneously at that moment at all spatially distributed points. The numerical correction model for environmental variables was validated for the environmental parameter air temperature, and the values corrected by the method did not differ, by Tukey's test at 5% probability, from the actual values recorded by dataloggers.
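The abstract does not spell out the numerical scheme; one plausible minimal sketch is to linearly interpolate each point's time-stamped readings to a common reference instant. The data, units and function name below are illustrative assumptions, not the study's actual method.

```python
import numpy as np

def correct_to_reference(times, values, t_ref):
    """Interpolate a sensor's time-stamped readings (times in minutes, values
    e.g. air temperature in C) to the reference instant t_ref, as if all
    spatially distributed points had been measured simultaneously at t_ref."""
    return float(np.interp(t_ref, times, values))

# Hypothetical readings from two spatial points, recorded with a time lag
point_a = correct_to_reference([0.0, 10.0, 20.0], [24.1, 24.9, 25.6], t_ref=12.0)
point_b = correct_to_reference([2.0, 12.0, 22.0], [23.8, 24.5, 25.3], t_ref=12.0)
print(point_a, point_b)
```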
Abstract:
The preparation and administration of medications is one of the most common and relevant functions of nurses, demanding great responsibility. Incorrect administration of medication currently constitutes a serious problem in health services and is considered one of the main adverse effects suffered by hospitalized patients. Objectives: To identify the major errors in the preparation and administration of medication by nurses in hospitals and to determine which factors lead to errors in the preparation and administration of medication. Methods: A systematic review of the literature. The inclusion criteria were: original scientific papers, complete, published from 2011 to May 2016, in the SciELO and LILACS databases, performed in a hospital environment, addressing errors in the preparation and administration of medication by nurses, and written in Portuguese. After application of the inclusion criteria, a sample of 7 articles was obtained. Results: The main errors identified in the preparation and administration of medication were wrong dose (71.4%), wrong time (71.4%), inadequate dilution (57.2%), incorrect patient selection (42.8%) and incorrect administration route (42.8%). The factors most commonly reported by the nursing staff as causes of error were lack of human resources (57.2%), inappropriate locations for the preparation of medication (57.2%), the presence of noise and low lighting in the preparation location (57.2%), untrained professionals (42.8%), fatigue and stress (42.8%) and inattention (42.8%). Conclusions: The literature shows a high error rate in the preparation and administration of medication for various reasons, making it important that preventive measures against this occurrence be implemented.
Abstract:
We provide a comprehensive study of out-of-sample forecasts for the EUR/USD exchange rate based on multivariate macroeconomic models and forecast combinations. We use profit maximization measures based on directional accuracy and trading strategies in addition to standard loss minimization measures. When comparing predictive accuracy and profit measures, tests that are free of data-snooping bias are used. The results indicate that forecast combinations, in particular those based on principal components of forecasts, help to improve over benchmark trading strategies, although the excess return per unit of deviation is limited.
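A principal-component forecast combination of the kind mentioned above can be sketched as follows; the forecast matrix, realized values and use of only the first component are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

# Hypothetical matrix of individual model forecasts: rows = periods, cols = models
forecasts = np.array([
    [1.10, 1.12, 1.09],
    [1.11, 1.13, 1.10],
    [1.09, 1.10, 1.08],
    [1.12, 1.14, 1.11],
])
realized = np.array([1.11, 1.12, 1.09, 1.13])    # realized exchange rate

# Principal components of the (centered) individual forecasts
centered = forecasts - forecasts.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
pc1 = centered @ vt[0]                           # first principal component scores

# Combine: regress realized values on the first PC plus an intercept
X = np.column_stack([np.ones_like(pc1), pc1])
beta, *_ = np.linalg.lstsq(X, realized, rcond=None)
combined_forecast = X @ beta
print(combined_forecast)
```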
Abstract:
Ph.D. in the Faculty of Business Administration
Abstract:
[No abstract]
Abstract:
Rates of survival of victims of sudden cardiac arrest (SCA) using cardiopulmonary resuscitation (CPR) have shown little improvement over the past three decades. Since registered nurses (RNs) comprise the largest group of healthcare providers in U.S. hospitals, it is essential that they are competent in performing the four primary measures of CPR (compression, ventilation, medication administration, and defibrillation) in order to improve survival rates of SCA patients. The purpose of this experimental study was to test a color-coded SMOCK system on: 1) time to implement emergency patient care measures, 2) technical skills performance, 3) number of medical errors, and 4) team performance during simulated CPR exercises. The study sample was 260 RNs (M = 40 years, SD = 11.6) with work experience as an RN (M = 7.25 years, SD = 9.42). Nurses were allocated to a control or intervention arm consisting of 20 groups of 5-8 RNs per arm, for a total of 130 RNs in each arm. Nurses in each study arm were given clinical scenarios requiring emergency CPR. Nurses in the intervention group wore different color-labeled aprons (smocks) indicating their role assignment (medications, ventilation, compression, defibrillation, etc.) on the code team during CPR. Findings indicated that the intervention using color-labeled smocks for pre-assigned roles had a significant effect on the time nurses started compressions (t = 3.03, p = 0.005), ventilations (t = 2.86, p = 0.004) and defibrillations (t = 2.00, p = 0.05) when compared to the controls using the standard of care. In performing technical skills, nurses in the intervention groups performed compressions and ventilations significantly better than those in the control groups. The control groups made significantly more total errors (M = 7.55, SD = 1.54) than the intervention group (M = 5.60, SD = 1.90; t = -2.61, p = 0.013). There were no significant differences in team performance measures between the groups. Study findings indicate that the use of color-labeled smocks during CPR emergencies resulted in shorter times to start emergency CPR, reduced errors, more technical skills completed successfully, and no differences in team performance.
Abstract:
Students with specific learning disabilities (SLD) typically learn less history content than their peers without disabilities and show fewer learning gains. Even when they are provided with the same instructional strategies, many students with SLD struggle to grasp complex historical concepts and content area vocabulary. Many strategies involving technology have been used in the past to enhance learning for students with SLD in history classrooms. However, very few studies have explored the effectiveness of emerging mobile technology in K-12 history classrooms. This study investigated the effects of mobile devices (iPads) as an active student response (ASR) system on the acquisition of U.S. history content by middle school students with SLD. An alternating treatments single subject design was used to compare the effects of two interventions. There were two conditions and a series of pretest probes in this study. The conditions were: (a) direct instruction and studying from handwritten notes using the interactive notebook strategy, and (b) direct instruction and studying using the Quizlet App on the iPad. There were three dependent variables in this study: (a) percent correct on tests, (b) rate of correct responses per minute, and (c) rate of errors per minute. A comparative analysis suggested that both interventions (studying from interactive notes and studying using Quizlet on the iPad) had varying degrees of effectiveness in increasing the learning gains of students with SLD. In most cases, both interventions were equally effective. During both interventions, all of the participants increased their percentage correct and their rate of correct responses. Most of the participants decreased their rate of errors. The results of this study suggest that teachers of students with SLD should consider a post-lesson review in the form of mobile devices as an ASR system, or studying from handwritten notes, paired with existing evidence-based practices to facilitate students' knowledge of U.S. history. Future research should focus on the use of other interactive applications on various mobile operating platforms and on other social studies subjects, and should explore various testing formats such as oral question-answer and multiple choice.
Abstract:
Introduction: Since 2005, the workload of community pharmacists in England has increased, with a concomitant increase in stress and work pressure. However, it is unclear how these factors are impacting on the ability of community pharmacists to ensure accuracy during the dispensing process. This research seeks to extend our understanding of the nature, outcome, and predictors of dispensing errors. Methodology: A retrospective analysis of a purposive sample of incident report forms (IRFs) from the database of a pharmacist indemnity insurance provider was conducted. Data collected included: type of error, degree of harm caused, pharmacy and pharmacist demographics, and possible contributory factors. Results: In total, 339 files from UK community pharmacies were retrieved from the database. The files dated from June 2006 to November 2011. Incorrect item (45.1%, n = 153/339), followed by incorrect strength (24.5%, n = 83/339), were the most common forms of error. Almost half (41.6%, n = 147/339) of the patients suffered some form of harm, ranging from minor harm (26.7%, n = 87/339) to death (0.3%, n = 1/339). Insufficient staff (51.6%, n = 175/339), similar packaging (40.7%, n = 138/339) and the pharmacy being busier than normal (39.5%, n = 134/339) were identified as key contributory factors. Cross-tabular analysis against the final accuracy check variable revealed significant associations with the pharmacy location (P < 0.024), dispensary layout (P < 0.025), insufficient staff (P < 0.019), and busier than normal (P < 0.005) variables. Conclusion: The results provide an overview of some of the individual, organisational and technical factors at play at the time of a dispensing error and highlight the need to examine further the relationships between these factors and dispensing error occurrence.
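A cross-tabular analysis of this kind is commonly carried out as a chi-square test on a contingency table; a minimal sketch with entirely hypothetical counts (not the study's data or necessarily its exact test):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table:
# rows = final accuracy check performed (yes/no), cols = insufficient staff (yes/no)
table = np.array([[40, 60],
                  [85, 50]])

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```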
Abstract:
Human radiosensitivity is a quantitative trait that is generally subject to a binomial distribution. Individual radiosensitivity, however, may deviate significantly from the mean (by 2-3 standard deviations). Thus, the same dose of radiation may result in different levels of genotoxic damage (commonly measured as chromosome aberration rates) in different individuals. There is a significant genetic component in individual radiosensitivity. It is related to carriership of variant alleles of various single-nucleotide polymorphisms (most of these in genes coding for proteins functioning in DNA damage identification and repair); carriership of different numbers of alleles producing cumulative effects; amplification of gene copies coding for proteins responsible for radioresistance; mobile genetic elements; and others. Among the other factors influencing individual radioresistance are: radioadaptive response; bystander effect; levels of endogenous substances with radioprotective and antimutagenic properties; and environmental factors such as lifestyle and diet, physical activity, psychoemotional state, hormonal state, certain drugs, infections and others. These factors may have radioprotective or sensitising effects. Evidently, many factors may significantly modulate the biological effects of ionising radiation. Thus, conventional methodologies for biodosimetry (specifically, cytogenetic methods) may produce significant errors if personal traits that may affect radioresistance are not accounted for.
Abstract:
Interactions with mobile devices normally happen in an explicit manner, which means that they are initiated by the users. Yet, users are typically unaware that they also interact implicitly with their devices. For instance, our hand pose changes naturally when we type text messages. Whilst the touchscreen captures finger touches, hand movements during this interaction are, however, unused. If this implicit hand movement is observed, it can be used as additional information to support or enhance the users' text entry experience. This thesis investigates how implicit sensing can be used to improve the quality of existing, standard interaction techniques. In particular, this thesis looks into enhancing front-of-device interaction through back-of-device and hand movement implicit sensing. We propose to investigate this through machine learning techniques. We look into how sensor data gathered via implicit sensing can be used to predict a certain aspect of an interaction. For instance, one of the questions that this thesis attempts to answer is whether hand movement during a touch targeting task correlates with the touch position. This is a complex relationship to understand, but it can best be explained through machine learning. Using machine learning as a tool, such correlation can be measured, quantified, understood and used to make predictions about future touch positions. Furthermore, this thesis also evaluates the predictive power of the sensor data. We show this through a number of studies. In Chapter 5 we show that probabilistic modelling of sensor inputs and recorded touch locations can be used to predict the general area of future touches on the touchscreen. In Chapter 7, using SVM classifiers, we show that data from implicit sensing during general mobile interactions is user-specific. This can be used to identify users implicitly. In Chapter 6, we also show that touch interaction errors can be detected from sensor data. In our experiment, we show that there are sufficiently distinguishable patterns between normal interaction signals and signals that are strongly correlated with interaction errors. In all studies, we show that a performance gain can be achieved by combining sensor inputs.
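As an illustration of the SVM-based user identification described for Chapter 7, a minimal sketch on synthetic, accelerometer-style features follows; the feature layout, kernel choice and data are assumptions, not the thesis's actual pipeline.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical features extracted from implicit motion sensing during touches,
# labelled by user identity (3 users, 60 interaction windows each)
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(60, 4)) for i in range(3)])
y = np.repeat([0, 1, 2], 60)

# Train an RBF-kernel SVM and report held-out user-identification accuracy
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print("user-identification accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```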