958 results for Check-In
Abstract:
Research on the topic of liquidity has greatly benefited from the improved availability of data. Researchers have addressed questions regarding the factors that influence bid-ask spreads and the relationship between spreads and risk, return and liquidity. Intra-day data have been used to measure the effective spread and researchers have been able to refine the concepts of liquidity to include the price impact of transactions on a trade-by-trade analysis. The growth in the creation of tax-transparent securities has greatly enhanced the visibility of securitized real estate, and has naturally led to the question of whether the increased visibility of real estate has caused market liquidity to change. Although the growth in the public market for securitized real estate has occurred in international markets, it has not been accompanied by universal publication of transaction data. Therefore this paper develops an aggregate daily data-based test for liquidity and applies the test to US data in order to check for consistency with the results of prior intra-day analysis. If the two approaches produce similar results, we can apply the same technique to markets in which less detailed data are available and offer conclusions on the liquidity of a wider set of markets.
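The abstract does not spell out the daily-data test itself. As a hedged illustration of the general idea of inferring an effective spread from aggregate daily prices, the Python sketch below implements Roll's (1984) serial-covariance estimator on simulated prices; this is a standard daily-data proxy, not the test developed in the paper.

```python
import numpy as np

def roll_spread(prices):
    """Roll (1984) effective-spread estimator from daily closing prices.

    Uses the first-order serial covariance of price changes:
    spread = 2 * sqrt(-cov(dp_t, dp_{t-1})), defined only when the
    covariance is negative (otherwise the estimator returns NaN).
    """
    dp = np.diff(np.asarray(prices, dtype=float))
    cov = np.cov(dp[1:], dp[:-1])[0, 1]
    return 2.0 * np.sqrt(-cov) if cov < 0 else np.nan

# Illustrative use with simulated prices (not real market data).
rng = np.random.default_rng(0)
mid = np.cumsum(rng.normal(0, 0.05, 250)) + 100       # latent mid-price
trade = mid + rng.choice([-0.25, 0.25], 250)          # +/- half-spread bounce
print(roll_spread(trade))                             # should be close to the true spread of 0.5
```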
Abstract:
The objective of this study was to investigate whether Salkovskis' (1985) inflated responsibility model of obsessive-compulsive disorder (OCD) applied to children. In an experimental design, 81 children aged 9–12 years were randomly allocated to three conditions: an inflated responsibility group, a moderate responsibility group, and a reduced responsibility group. In all groups children were asked to sort sweets according to whether or not they contained nuts. At baseline the groups did not differ on children's self-reported anxiety, depression, obsessive-compulsive symptoms or on inflated responsibility beliefs. The experimental manipulation successfully changed children's perceptions of responsibility. During the sorting task, the time taken to complete the task, checking behaviours, hesitations, and anxiety were recorded. There was a significant effect of responsibility level on the behavioural variables of time taken, hesitations and checking: as perceived responsibility increased, children took longer to complete the task and checked and hesitated more often. There was no between-group difference in children's self-reported state anxiety. The results offer preliminary support for the link between inflated responsibility and increased checking behaviours in children and add to the small but growing literature suggesting that cognitive models of OCD may apply to children.
Abstract:
A number of tests exist to check for statistical significance of phase synchronisation within the Electroencephalogram (EEG); however, the majority suffer from a lack of generality and applicability. They may also fail to account for temporal dynamics in the phase synchronisation, regarding synchronisation as a constant state instead of a dynamical process. Therefore, a novel test is developed for identifying the statistical significance of phase synchronisation based upon a combination of work characterising temporal dynamics of multivariate time-series and Markov modelling. We show how this method is better able to assess the significance of phase synchronisation than a range of commonly used significance tests. We also show how the method may be applied to identify and classify significantly different phase synchronisation dynamics in both univariate and multivariate datasets.
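The Markov-based test itself is not described in enough detail here to reproduce. As background, the quantity usually being tested is a phase-locking statistic computed from Hilbert-transform phases; the Python sketch below computes a phase-locking value and a crude time-shift surrogate p-value. It is a generic baseline for comparison, not the authors' method.

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase-locking value between two 1-D signals via the Hilbert transform."""
    phase_diff = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * phase_diff)))

def plv_surrogate_pvalue(x, y, n_surrogates=200, rng=None):
    """Crude significance check: compare the observed PLV against PLVs of
    time-shifted surrogates of y (a common, simple null model)."""
    rng = rng or np.random.default_rng(0)
    observed = plv(x, y)
    null = [plv(x, np.roll(y, rng.integers(1, len(y)))) for _ in range(n_surrogates)]
    return observed, np.mean(np.array(null) >= observed)

# Illustrative use with two noisy signals sharing a 10 Hz component.
t = np.arange(0, 5, 1 / 250.0)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(len(t))
y = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * np.random.randn(len(t))
print(plv_surrogate_pvalue(x, y))
```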
Abstract:
Introduction: Care home residents are at particular risk from medication errors, and our objective was to determine the prevalence and potential harm of prescribing, monitoring, dispensing and administration errors in UK care homes, and to identify their causes. Methods: A prospective study of a random sample of residents within a purposive sample of homes in three areas. Errors were identified by patient interview, note review, observation of practice and examination of dispensed items. Causes were understood by observation and from theoretically framed interviews with home staff, doctors and pharmacists. Potential harm from errors was assessed by expert judgement. Results: The 256 residents recruited in 55 homes were taking a mean of 8.0 medicines. One hundred and seventy-eight residents (69.5%) had one or more errors; the mean number per resident was 1.9 errors. The mean potential harm from prescribing, monitoring, administration and dispensing errors was 2.6, 3.7, 2.1 and 2.0 (0 = no harm, 10 = death), respectively. Contributing factors from the 89 interviews included doctors who were not accessible, did not know the residents and lacked information in homes when prescribing; home staff's high workload, lack of medicines training and drug round interruptions; lack of teamwork among home, practice and pharmacy; inefficient ordering systems; inaccurate medicine records and prevalence of verbal communication; and difficult-to-fill (and check) medication administration systems. Conclusions: That two thirds of residents were exposed to one or more medication errors is of concern. The will to improve exists, but there is a lack of overall responsibility. Action is required from all concerned.
Abstract:
This paper describes the hydrochemistry of a lowland, urbanised river-system, The Cut in England, using in situ sub-daily sampling. The Cut receives effluent discharges from four major sewage treatment works serving around 190,000 people. These discharges consist largely of treated water, originally abstracted from the River Thames and returned via the water supply network, substantially increasing the natural flow. The hourly water quality data were supplemented by weekly manual sampling with laboratory analysis to check the hourly data and measure further determinands. Mean phosphorus and nitrate concentrations were very high, breaching standards set by EU legislation. Though 56% of the catchment area is agricultural, the hydrochemical dynamics were significantly impacted by effluent discharges which accounted for approximately 50% of the annual P catchment input loads and, on average, 59% of river flow at the monitoring point. Diurnal dissolved oxygen data demonstrated high in-stream productivity. From a comparison of high frequency and conventional monitoring data, it is inferred that much of the primary production was dominated by benthic algae, largely diatoms. Despite the high productivity and nutrient concentrations, the river water did not become anoxic and major phytoplankton blooms were not observed. The strong diurnal and annual variation observed showed that assessments of water quality made under the Water Framework Directive (WFD) are sensitive to the time and season of sampling. It is recommended that specific sampling time windows be specified for each determinand, and that WFD targets should be applied in combination to help identify periods of greatest ecological risk.
Abstract:
In 2004 the National Household Survey (Pesquisa Nacional por Amostra de Domicílios - PNAD) estimated the prevalence of food and nutrition insecurity in Brazil. However, PNAD data cannot be disaggregated at the municipal level. The objective of this study was to build a statistical model to predict severe food insecurity for Brazilian municipalities based on the PNAD dataset. Exclusion criteria were: incomplete food security data (19.30%); informants younger than 18 years old (0.07%); collective households (0.05%); and households headed by indigenous persons (0.19%). The modeling was carried out in three stages, beginning with the selection of variables related to food insecurity using univariate logistic regression. The variables chosen to construct the municipal estimates were selected from those included in the PNAD as well as the 2000 Census. Multivariate logistic regression was then applied, removing non-significant variables and adjusting the odds ratios by multiple logistic regression. The Wald test was applied to check the significance of the coefficients in the logistic equation. The final model included the variables: per capita income; years of schooling; race and gender of the household head; urban or rural residence; access to public water supply; presence of children; total number of household inhabitants; and state of residence. The adequacy of the model was tested using the Hosmer-Lemeshow test (p=0.561) and the ROC curve (area=0.823). These tests indicated that the model has strong predictive power and can be used to determine household food insecurity in Brazilian municipalities, suggesting that similar predictive models may be useful tools in other Latin American countries.
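For readers unfamiliar with the workflow, a minimal sketch of fitting a logistic model and checking its discrimination with the area under the ROC curve is shown below. It uses scikit-learn on synthetic data with hypothetical predictor names and is not the authors' model or dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for household-level predictors (hypothetical names):
# per-capita income, years of schooling, urban residence, household size.
rng = np.random.default_rng(42)
n = 5000
X = np.column_stack([
    rng.lognormal(6.0, 1.0, n),     # per-capita income
    rng.integers(0, 16, n),         # years of schooling of household head
    rng.integers(0, 2, n),          # urban (1) vs rural (0)
    rng.integers(1, 10, n),         # household size
])
# Synthetic outcome: severe food insecurity more likely with low income/schooling.
logit = 2.0 - 0.0005 * X[:, 0] - 0.15 * X[:, 1] + 0.1 * X[:, 3]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("ROC AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```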
Abstract:
Aim: It is well reported in the scientific literature that there is a high level of periodontal disease and a lower caries prevalence in Down Syndrome (DS) individuals when compared with age-matched non-DS individuals. This study was conducted to investigate the process of dental caries in DS children. Materials and methods: In this study the following parameters were considered: oral hygiene habits, levels of Streptococcus mutans (SM) and Lactobacillus spp. (LB), Modified Gingival Index (MGI), and Simplified Oral Hygiene Index (OHI-S). A case group of DS children (n=69) and a control group of non-DS children (n=69) were formed to perform this study. Dental caries severity was determined using the DMFT index. Samples of non-stimulated saliva were collected to determine the Lactobacillus spp. levels. For SM levels, MSB agar plates were used. Results: The findings revealed that the case group attended dental check-ups more frequently, brushed their teeth more times per day, flossed less, and also more frequently had SM levels classified as "high count". The MGI was higher and the OHI-S was lower than in the control group (p<0.001). Conclusion: No significant differences were found between the DMFT indexes of children from the two groups (p=0.345). The logistic regression analysis showed that in the case group, age, MGI, and SM count were positively related to dental caries (p<0.05).
Abstract:
Purpose: The interference of electric fields (EF) with biological processes is an issue of considerable interest. No studies have yet been reported on the combined effect of an EF plus ionising radiation. Here we report studies on this combined effect using the prokaryote Microcystis panniformis, the eukaryote Candida albicans and human cells. Materials and methods: Cultures of Microcystis panniformis (Cyanobacteria) in glass tubes were irradiated with doses in the interval 0.5-5 kGy, using a 60Co gamma source facility. Samples irradiated with 3 kGy were exposed for 2 h to a 20 V/cm static electric field and viable cells were enumerated. Cultures of Candida albicans were incubated at 36 °C for 20 h, gamma-irradiated with doses from 1 to 4 kGy, and submitted to an electric field of 180 V/cm. Samples were examined under a fluorescence microscope and the numbers of unviable (red) and viable (apple-green fluorescence) cells were determined. For cross-checking purposes, MRC5 lung cells were irradiated with 2 Gy, exposed to an electric field of 1250 V/cm, incubated overnight with the anti-phospho-histone H2AX antibody and examined under a fluorescence microscope to quantify nuclei with γ-H2AX foci. Results: In cells exposed to an EF, death increased substantially compared to irradiation alone. In C. albicans we observed suppression of the DNA repair shoulder. The effect of the EF on the growth of M. panniformis was substantial; the number of surviving cells on day 2 after irradiation was 12 times greater than when an EF was applied. Under the action of a static electric field on the irradiated MRC5 cells, the number of nuclei with γ-H2AX foci increased by approximately 40%. Conclusions: Application of an EF following irradiation greatly increases cell death. The observation that the DNA repair shoulder in the survival curve of C. albicans is suppressed when cells are exposed to irradiation plus an EF suggests that the EF likely inactivates cellular recovery processes. The result for the number of nuclei with γ-H2AX foci in MRC5 cells indicates that an EF interferes mostly with the DNA repair mechanisms. An ad hoc molecular model is proposed.
Abstract:
We investigate the critical behavior of a stochastic lattice model describing a predator-prey system. By means of a Monte Carlo procedure we simulate the model defined on a regular square lattice and determine the threshold of species coexistence, that is, the critical phase boundaries related to the transition between an active state, where both species coexist, and an absorbing state, where one of the species is extinct. A finite-size scaling analysis is employed to determine the order parameter, order-parameter fluctuations, correlation length and the critical exponents. Our numerical results for the critical exponents agree with those of the directed percolation universality class. We also check the validity of the hyperscaling relation and present the data collapse curves.
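The abstract does not give the model's rules or rates. The Python sketch below is a generic square-lattice predator-prey Monte Carlo simulation with an absorbing state for the predators, included only to illustrate the kind of procedure described; all rules, rates and parameters here are assumptions, not the paper's model.

```python
import numpy as np

EMPTY, PREY, PREDATOR = 0, 1, 2
NEIGHBOURS = [(-1, 0), (1, 0), (0, -1), (0, 1)]

def sweep(lattice, birth=0.5, predation=0.4, death=0.1, rng=None):
    """One Monte Carlo sweep (L*L random single-site updates) of a simple
    square-lattice predator-prey model with periodic boundaries.

    Illustrative rules (not the paper's): prey reproduce into an empty
    neighbour with prob. `birth`; a predator eats a neighbouring prey and
    reproduces there with prob. `predation`; a predator dies with prob.
    `death`. A lattice without predators is absorbing for the predators.
    """
    rng = rng or np.random.default_rng()
    L = lattice.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        di, dj = NEIGHBOURS[rng.integers(4)]
        ni, nj = (i + di) % L, (j + dj) % L
        site, nbr = lattice[i, j], lattice[ni, nj]
        if site == PREY and nbr == EMPTY and rng.random() < birth:
            lattice[ni, nj] = PREY
        elif site == PREDATOR:
            if nbr == PREY and rng.random() < predation:
                lattice[ni, nj] = PREDATOR
            elif rng.random() < death:
                lattice[i, j] = EMPTY
    return lattice

# Illustrative run: track the predator density as a rough order parameter.
rng = np.random.default_rng(1)
L = 64
lattice = rng.integers(0, 3, size=(L, L))
for t in range(200):
    sweep(lattice, rng=rng)
print("predator density after 200 sweeps:", np.mean(lattice == PREDATOR))
```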
Abstract:
One of the key issues in e-learning environments is the possibility of creating and evaluating exercises. However, the lack of tools supporting the authoring and automatic checking of exercises for specific topics (e.g., geometry) drastically reduces the advantages of using e-learning environments on a larger scale, as usually happens in Brazil. This paper describes an algorithm, and a tool based on it, designed for the authoring and automatic checking of geometry exercises. The algorithm dynamically compares the distances between the geometric objects of the student's solution and the template solution provided by the author of the exercise. Each solution is a geometric construction which is considered a function receiving geometric objects (input) and returning other geometric objects (output). Thus, for a given problem, if we know one function (construction) that solves the problem, we can compare it to any other function to check whether they are equivalent or not. Two functions are equivalent if, and only if, they have the same output when the same input is applied. If the student's solution is equivalent to the template solution, then we consider the student's solution to be correct. Our software utility provides both authoring and checking tools that work directly on the Internet, together with learning management systems. These tools are implemented using the dynamic geometry software iGeom, which has been used in a geometry course since 2004 and has a successful track record in the classroom. Empowered with these new features, iGeom simplifies teachers' tasks, solves non-trivial problems in student solutions and helps to increase student motivation by providing feedback in real time.
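The input-output equivalence idea can be illustrated in miniature: sample random inputs, run both constructions, and compare outputs within a numeric tolerance. The Python sketch below uses hypothetical midpoint constructions and is a simplification of the iGeom algorithm, which compares distances between the geometric objects of the two solutions.

```python
import math
import random

def template_midpoint(ax, ay, bx, by):
    """Template construction: midpoint of segment AB."""
    return ((ax + bx) / 2.0, (ay + by) / 2.0)

def student_midpoint(ax, ay, bx, by):
    """Hypothetical student construction: A plus half of the vector AB."""
    return (ax + (bx - ax) / 2.0, ay + (by - ay) / 2.0)

def equivalent(f, g, trials=1000, tol=1e-9):
    """Probabilistic equivalence check: same outputs for the same random inputs."""
    for _ in range(trials):
        args = [random.uniform(-100, 100) for _ in range(4)]
        px, py = f(*args)
        qx, qy = g(*args)
        if math.hypot(px - qx, py - qy) > tol:
            return False
    return True

print(equivalent(template_midpoint, student_midpoint))  # True: same construction
```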
Abstract:
We construct five new elements of degree 6 in the nucleus of the free alternative algebra. We use the representation theory of the symmetric group to locate the elements. We use the computer algebra system ALBERT and an extension of ALBERT to express the elements in compact form and to show that these new elements are not a consequence of the known degree-5 elements in the nucleus. We prove that these five new elements and four known elements form a basis for the subspace of nuclear elements of degree 6. Our calculations are done using modular arithmetic to save memory and time. The calculations can be done in characteristic zero or any prime greater than 6, and similar results are expected. We generated the nuclear elements using prime 103. We check our answer using five other primes.
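The ALBERT computations are not reproduced here, but the general device of doing the linear algebra in modular arithmetic and repeating the calculation with several primes can be sketched generically. The matrix and primes below are placeholders; the point is only that the rank (and hence a linear-independence check) is computed consistently across primes.

```python
import numpy as np

def rank_mod_p(matrix, p):
    """Rank of an integer matrix over GF(p) via Gaussian elimination mod p."""
    A = np.array(matrix, dtype=np.int64) % p
    rows, cols = A.shape
    rank, row = 0, 0
    for col in range(cols):
        pivot = next((r for r in range(row, rows) if A[r, col]), None)
        if pivot is None:
            continue
        A[[row, pivot]] = A[[pivot, row]]            # move pivot row into place
        inv = pow(int(A[row, col]), -1, p)           # modular inverse of the pivot
        A[row] = (A[row] * inv) % p
        for r in range(rows):
            if r != row and A[r, col]:
                A[r] = (A[r] - A[r, col] * A[row]) % p
        row += 1
        rank += 1
    return rank

# Check that the rank is stable across several primes (placeholder matrix).
M = [[2, 4, 6], [1, 2, 3], [0, 1, 5]]
print({p: rank_mod_p(M, p) for p in (103, 101, 97, 89, 83, 79)})
```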
Abstract:
Depiction of students at the New York Trade School using closed-circuit TV to film a voltage-regulator check performed by William C. H. Meyer. Original caption reads, "Closed-circuit TV takes a class at the New York Trade School into the Automotive Shop where William C. H. Meyer, head of the Automotive Department, demonstrates a voltage-regulator check. Students Robert Niefeld (left) and Denis Mahoney serve as cameramen." Black and white photograph, part of a series of four photographs accompanying a press release of the New York Trade School announcing the demonstration of a new technique in closed-circuit TV developed at the school.
Abstract:
The main purpose of this thesis project is to predict symptom severity and cause from Parkinson's disease patients' test-battery data using data mining. The data were collected from a hand-based test battery administered on a computer. We use the Chi-Square method to check which variables are important and which are not. We then apply different data mining techniques to the normalised data and check which technique or method gives good results. The implementation of this thesis is in WEKA. After normalising the data, we apply three methods: Naïve Bayes, CART and KNN. Bland-Altman plots and Spearman's correlation are used to check the final results and predictions: the Bland-Altman plot indicates the proportion of predictions that fall within the confidence limits, and Spearman's correlation indicates how strong the relationship is. On the basis of the results and analysis, all three methods give nearly the same results, but CART (the J48 decision tree) gives the best results, with under-predicted and over-predicted values lying between -2 and +2. The correlation between the actual and predicted values for CART is 0.794. Cause gives a better classification percentage than disability because it uses only two classes.
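The thesis itself works in WEKA; as a rough illustration of the evaluation described, the Python sketch below trains a decision-tree and a k-NN regressor on synthetic data and reports Spearman's correlation together with Bland-Altman-style bias and limits of agreement. Feature names and data are entirely synthetic, and the models are not the thesis's configurations.

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor

# Synthetic stand-in for normalised test-battery features and a severity score.
rng = np.random.default_rng(7)
X = rng.normal(size=(400, 6))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(0, 0.5, 400)   # hypothetical severity

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("tree", DecisionTreeRegressor(max_depth=5, random_state=0)),
                    ("knn", KNeighborsRegressor(n_neighbors=5))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rho, _ = spearmanr(y_te, pred)
    diff = pred - y_te                        # Bland-Altman: prediction minus actual
    loa = 1.96 * diff.std(ddof=1)             # 95% limits of agreement around the bias
    print(f"{name}: Spearman rho={rho:.3f}, bias={diff.mean():.3f}, LoA=+/-{loa:.3f}")
```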
Abstract:
The advancement of GPS technology enables GPS devices not only to be used as orientation and navigation tools, but also to track travelled routes. GPS tracking data provide essential information for a broad range of urban planning applications such as transportation routing and planning, traffic management and environmental control. This paper describes the processing of data collected by tracking the cars of 316 volunteers over a seven-week period. Detailed trip information is extracted, and the processed data are then connected to the underlying road network by means of maps. Geographical maps are applied to check how the car movements match the road network, and they capture the complexity of the car movements in the urban area. The results show that 90% of the trips on the plane match the road network within a given tolerance.
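A minimal sketch of the kind of tolerance check implied (is each tracked point within a given distance of some road segment?) is shown below in planar coordinates. Real map matching is considerably more involved; the road segments, track and tolerance here are hypothetical.

```python
import math

def point_segment_distance(px, py, ax, ay, bx, by):
    """Shortest distance from point P to segment AB in planar coordinates."""
    abx, aby = bx - ax, by - ay
    apx, apy = px - ax, py - ay
    denom = abx * abx + aby * aby
    t = 0.0 if denom == 0 else max(0.0, min(1.0, (apx * abx + apy * aby) / denom))
    cx, cy = ax + t * abx, ay + t * aby          # closest point on the segment
    return math.hypot(px - cx, py - cy)

def fraction_on_network(track, segments, tolerance=15.0):
    """Fraction of GPS points lying within `tolerance` (e.g. metres) of any road segment."""
    hits = sum(
        1 for (px, py) in track
        if any(point_segment_distance(px, py, *seg) <= tolerance for seg in segments)
    )
    return hits / len(track)

# Hypothetical road segments ((ax, ay, bx, by)) and a short GPS track.
roads = [(0, 0, 100, 0), (100, 0, 100, 100)]
track = [(10, 3), (50, -4), (98, 20), (105, 60), (40, 40)]
print(fraction_on_network(track, roads, tolerance=15.0))   # 0.8 in this toy example
```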
Abstract:
One of a series of photographs accompanying a press release by the New York Trade School announcing the development and demonstration of a new technique in closed-circuit TV. Student Dennis Mahoney serves as one of the cameramen as William C. H. Meyers of the Automotive Department performs the demonstration. Black and white photograph.