900 results for Data Structures, Cryptology and Information Theory
Abstract:
Background: A prerequisite for high performance in motor tasks is the acquisition of egocentric sensory information that must be translated into motor actions. A phenomenon that supports this process is the Quiet Eye (QE), defined as the long final fixation before movement initiation. It is assumed that the QE facilitates information processing, particularly regarding movement parameterization. Aims: The question remains whether this facilitation also holds for the information-processing stage of response selection and the perceptually crucial stage of stimulus identification. Method: In two experiments with sport science students, performance-enhancing effects of experimentally manipulated QE durations were tested as a function of target position predictability and target visibility, thereby selectively manipulating response selection and stimulus identification demands, respectively. Results: The results support the hypothesis of facilitated information processing through long QE durations, since in both experiments performance-enhancing effects of long QE durations were found only under increased processing demands. In Experiment 1, QE duration affected performance only if the target position was not predictable and positional information had to be processed over the QE period. In Experiment 2, in a comparison of full vs. no target visibility with saccades to the upcoming target position induced by flicker cues, the functionality of a long QE duration depended on the visual stimulus identification period as soon as that interval fell below a certain threshold. Conclusions: The results corroborate earlier findings that QE efficiency depends on the demands placed on the visuomotor system, thereby furthering the assumption that the phenomenon supports the processes of sensorimotor integration.
Abstract:
New pollen-based reconstructions of summer (May-to-August) and winter (December-to-February) temperatures between 15 and 8 ka BP along a S-N transect in the Baltic-Belarus (BB) area display trends in temporal and spatial changes in climate variability. These results are complemented by two chironomid-based July mean temperature reconstructions. The magnitude of change compared with modern temperatures was more prominent in the northern part of the BB area. The 4 °C winter and 2 °C summer warming at the start of GI-1 was delayed in the BB area, and Lateglacial maximum temperatures were reached at ca 13.6 ka BP, being 4 °C colder than the modern mean. The Younger Dryas cooling in the area was 5 °C colder than present, as inferred by all proxies. In addition, our analyses show an early Holocene divergence in winter temperature trends, with modern values being reached 1 ka earlier (10 ka BP) in southern BB than in the northern part of the region (9 ka BP).
Abstract:
Numerical calculations describing weathering of the Poços de Caldas alkaline complex (Minas Gerais, Brazil) by infiltrating groundwater are carried out for time spans up to two million years in the absence of pyrite, and up to 500,000 years with pyrite present. Deposition of uranium resulting from infiltration of oxygenated, uranium-bearing groundwater through the hydrothermally altered phonolitic host rock at the Osamu Utsumi uranium mine is also included in the latter calculation. The calculations are based on the quasi-stationary state approximation to mass conservation equations for pure advective transport. This approximation enables the prediction of solute concentrations, mineral abundances, and porosity as functions of time and distance over geologic time spans. Mineral reactions are described by kinetic rate laws for both precipitation and dissolution. Homogeneous equilibrium is assumed to be maintained within the aqueous phase. No constraints are imposed on the calculations other than the initial composition of the unaltered host rock and the composition of the inlet fluid, taken as rainwater modified by percolation through a soil zone. The results are in qualitative agreement with field observations at the Osamu Utsumi uranium mine. They predict a lateritic cover followed by a highly porous saprolitic zone, a zone of oxidized rock with pyrite replaced by iron hydroxide, a sharp redox front at which uranium is deposited, and the reduced unweathered host rock. Uranium is deposited in a narrow zone located on the reduced side of the redox front in association with pyrite, in agreement with field observations. The calculations predict the formation of a broad dissolution front of primary kaolinite that penetrates deep into the host rock, accompanied by the precipitation of secondary illite. Secondary kaolinite occurs in a saprolitic zone near the surface and in the vicinity of the redox front. Gibbsite forms a bimodal distribution consisting of a maximum near the surface followed by a thin tongue extending downward into the weathered profile, in agreement with field observations. The results are found to be insensitive to the kinetic rate constants used to describe mineral reactions.
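The abstract does not state the exact form of the kinetic rate laws; a common transition-state-theory form, shown here as an assumption rather than the paper's actual formulation, drives each mineral's rate by its departure from equilibrium:

$$ r_m = k_m A_m \left( 1 - \frac{Q_m}{K_m} \right), $$

where $k_m$ is the rate constant, $A_m$ the reactive surface area, $Q_m$ the ion activity product, and $K_m$ the equilibrium constant of mineral $m$. The rate is positive (dissolution) when the fluid is undersaturated ($Q_m < K_m$) and negative (precipitation) when it is supersaturated, which is how a single law can cover both precipitation and dissolution.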
Abstract:
Index tracking has become one of the most common strategies in asset management. The index-tracking problem consists of constructing a portfolio that replicates the future performance of an index by including only a subset of the index constituents in the portfolio. Finding the most representative subset is challenging when the number of stocks in the index is large. We introduce a new three-stage approach that first identifies promising subsets by employing data-mining techniques, then determines the stock weights in the subsets using mixed-binary linear programming, and finally evaluates the subsets based on cross-validation. The best subset is returned as the tracking portfolio. Our approach outperforms state-of-the-art methods in terms of out-of-sample performance and running times.
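Once the first stage has fixed a candidate subset, the binary inclusion variables are resolved and the weight-determination stage reduces to a linear program. The sketch below is a minimal illustration of that stage under the assumption of a mean-absolute-deviation tracking-error objective and a fully invested, long-only portfolio; the function name and formulation are illustrative, not the paper's exact model.

```python
# Minimal sketch: given a fixed candidate subset, find weights that
# minimize the mean absolute tracking error via a linear program.
import numpy as np
from scipy.optimize import linprog

def tracking_weights(R, y):
    """R: (T, n) returns of the subset stocks; y: (T,) index returns.
    Returns weights w minimizing (1/T) * sum_t |y_t - R_t @ w|."""
    T, n = R.shape
    # Decision vector x = [w_1..w_n, d_1..d_T] with d_t >= |y_t - R_t @ w|.
    c = np.concatenate([np.zeros(n), np.ones(T) / T])
    # Linearized absolute value: R_t @ w - d_t <= y_t and -R_t @ w - d_t <= -y_t.
    A_ub = np.block([[R, -np.eye(T)], [-R, -np.eye(T)]])
    b_ub = np.concatenate([y, -y])
    # Fully invested, long-only: sum(w) = 1, 0 <= w <= 1.
    A_eq = np.concatenate([np.ones(n), np.zeros(T)]).reshape(1, -1)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, 1)] * n + [(0, None)] * T)
    return res.x[:n]

# Toy usage: 60 return observations, a 5-stock candidate subset.
rng = np.random.default_rng(0)
R = rng.normal(0.001, 0.02, size=(60, 5))
y = R @ np.array([0.3, 0.25, 0.2, 0.15, 0.1]) + rng.normal(0, 0.002, 60)
print(tracking_weights(R, y).round(3))
```

Because the subset is fixed before this step, the combinatorial difficulty sits entirely in stage one; stage two stays a fast LP that can be solved for many candidate subsets.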
Abstract:
Accurate rainfall data are a key input parameter for modelling river discharge and soil loss. Remote areas of Ethiopia often lack adequate precipitation data, and where these data are available, there may be substantial temporal or spatial gaps. To address this challenge, the Climate Forecast System Reanalysis (CFSR) of the National Centers for Environmental Prediction (NCEP) readily provides weather data for any geographic location on Earth between 1979 and 2014. This study assesses the applicability of CFSR weather data to three watersheds in the Blue Nile Basin in Ethiopia. To this end, the Soil and Water Assessment Tool (SWAT) was set up to simulate discharge and soil loss, using CFSR and conventional weather data, in three small-scale watersheds ranging from 112 to 477 ha. Calibrated simulation results were compared to observed river discharge and observed soil loss over a period of 32 years. The conventional weather data resulted in very good discharge outputs for all three watersheds, while the CFSR weather data resulted in unsatisfactory discharge outputs for all three gauging stations. Soil loss simulation with conventional weather inputs yielded satisfactory outputs for two of the three watersheds, while the CFSR weather input yielded unsatisfactory results for all three. Overall, the simulations with the conventional data produced far better results for discharge and soil loss than the simulations with CFSR data. The simulations with CFSR data were unable to adequately represent the specific regional climate of the three watersheds, performing even worse in climatic areas with two rainy seasons. Hence, CFSR data should not be used lightly in remote areas lacking conventional weather data, where no prior comparative analysis is possible.
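The abstract rates simulations as "very good", "satisfactory", or "unsatisfactory" without naming its criterion. A common convention in SWAT studies is the Nash-Sutcliffe efficiency (NSE) with the threshold bands of Moriasi et al. (2007); the sketch below uses that convention as an assumption, not as this study's documented rating scheme.

```python
# Nash-Sutcliffe efficiency with Moriasi et al. (2007) rating bands for
# streamflow; the data values are illustrative.
import numpy as np

def nse(observed, simulated):
    """NSE: 1 is a perfect fit; values <= 0 mean the model predicts no
    better than the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2)

def rating(nse_value):
    if nse_value > 0.75:
        return "very good"
    if nse_value > 0.65:
        return "good"
    if nse_value > 0.50:
        return "satisfactory"
    return "unsatisfactory"

obs = [2.1, 3.4, 5.0, 4.2, 2.8]   # observed discharge (m^3/s)
sim = [2.0, 3.6, 4.7, 4.5, 2.9]   # simulated discharge (m^3/s)
print(rating(nse(obs, sim)))
```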
Abstract:
As librarians of the Social & Preventive Medicine Library in Bern, we help researchers perform systematic literature searches and teach students to use medical databases. We developed our skills mainly “on the job”, and we wondered how other health librarians in Europe were trained to become experts in searching. We had a great opportunity to “job shadow” specialists in this area of library service during a 5-day internship at the Royal Free Hospital Medical Library in London, Great Britain.
Abstract:
The entrepreneurial theory of the firm argues that entrepreneurship, properly understood, is a crucial but neglected element in explaining the nature and boundaries of the firm. By contrast, the theory of the entrepreneurial firm presumably seeks not to understand the nature and boundaries of "the firm" in general but rather to understand a particular type of firm: one that is entrepreneurial. This paper is an attempt to reconcile the two. After briefly tracing the concept of entrepreneurship in the work of Schumpeter, Kirzner, and (especially) Knight, the paper makes the case for the entrepreneurial theory of the firm. In such a theory, the firm exists as the solution to a coordination problem in a world of change and uncertainty, including Knightian or structural uncertainty. Taking a historical or developmental perspective, the paper then examines the changing nature of the entrepreneurial coordination problem over the firm's life cycle. In this formulation, "the entrepreneurial firm" is a nascent firm or proto-firm facing the problem of coordinating systemic change in economic capabilities. Lacking (by definition) adequate guidance from existing systems of rules of conduct embedded in markets or organizations, the entrepreneurial firm typically relies on a form of organization Max Weber called charismatic authority. In the end, although there is no such thing as a non-entrepreneurial firm, firms that must solve coordination problems in a world of novelty and systemic change ("entrepreneurial firms") are perhaps the purest case of the entrepreneurial theory of the firm.
Abstract:
Rotations are an integral part of the study of rotational spectroscopy, as well as of group theory; hence this introduction.
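As a standard textbook illustration of the group-theory connection (not taken from the source itself): rotations about a fixed axis compose by adding angles, which is exactly the group structure rotational spectroscopy exploits:

$$R_z(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta & 0 \\ \sin\theta & \cos\theta & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad R_z(\alpha)\,R_z(\beta) = R_z(\alpha+\beta).$$

Closure and associativity follow from matrix multiplication, $R_z(0)$ is the identity, and $R_z(-\theta)$ the inverse; the full set of rotations in three dimensions forms the group SO(3).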
Abstract:
Purpose: Traditional patient-specific IMRT QA measurements are labor intensive and consume machine time. Calculation-based IMRT QA methods typically are not comprehensive. We have developed a comprehensive calculation-based IMRT QA method to detect uncertainties introduced by the initial dose calculation, the data transfer through the Record-and-Verify (R&V) system, and various aspects of the physical delivery. Methods: We recomputed the treatment plans in the patient geometry for 48 cases using data from the R&V system and from the delivery unit to calculate the “as-transferred” and “as-delivered” doses, respectively. These data were sent to the original TPS to verify transfer and delivery, or to a second TPS to verify the original calculation. For each dataset we examined the dose computed from the R&V record (RV) and from the delivery records (Tx), and the dose computed with a second verification TPS (vTPS). Each verification dose was compared to the clinical dose distribution using 3D gamma analysis and by comparison of mean dose and ROI-specific dose levels to target volumes. Plans were also compared to IMRT QA absolute and relative dose measurements. Results: The average 3D gamma passing percentages using 3%-3mm, 2%-2mm, and 1%-1mm criteria for the RV plan were 100.0 (σ=0.0), 100.0 (σ=0.0), and 100.0 (σ=0.1); for the Tx plan they were 100.0 (σ=0.0), 100.0 (σ=0.0), and 99.0 (σ=1.4); and for the vTPS plan they were 99.3 (σ=0.6), 97.2 (σ=1.5), and 79.0 (σ=8.6). When comparing target volume doses in the RV, Tx, and vTPS plans to the clinical plans, the average ratios of ROI mean doses were 0.999 (σ=0.001), 1.001 (σ=0.002), and 0.990 (σ=0.009), and of ROI-specific dose levels were 0.999 (σ=0.001), 1.001 (σ=0.002), and 0.980 (σ=0.043), respectively. Comparing the clinical, RV, Tx, and vTPS calculated doses to the IMRT QA measurements for all 48 patients, the average ratios for absolute doses were 0.999 (σ=0.013), 0.998 (σ=0.013), 0.999 (σ=0.015), and 0.990 (σ=0.012), respectively, and the average 2D gamma (5%-3mm) passing percentages for relative doses for 9 patients were 99.36 (σ=0.68), 99.50 (σ=0.49), 99.13 (σ=0.84), and 98.76 (σ=1.66), respectively. Conclusions: Together with mechanical and dosimetric QA, our calculation-based IMRT QA method promises to minimize the need for patient-specific QA measurements by identifying outliers in need of further review.
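The gamma analysis reported above is 3D; the sketch below is a minimal 1D version of the gamma index, shown only to illustrate how the dose-difference and distance-to-agreement criteria combine (a real analysis works on 3D grids and typically interpolates the evaluated dose). The grid spacing and criteria values are illustrative assumptions.

```python
# 1D global gamma index: a reference point passes if some evaluated point
# is close in both dose (dd% of max dose) and position (dta mm).
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm, dd_pct=3.0, dta_mm=3.0):
    """Fraction of reference points with gamma <= 1 for dd%/dta-mm criteria."""
    ref = np.asarray(ref, float)
    eval_ = np.asarray(eval_, float)
    x = np.arange(ref.size) * spacing_mm
    dd = dd_pct / 100.0 * ref.max()        # global dose-difference norm
    gammas = np.empty(ref.size)
    for i in range(ref.size):
        dist2 = ((x - x[i]) / dta_mm) ** 2
        dose2 = ((eval_ - ref[i]) / dd) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return np.mean(gammas <= 1.0)

ref = np.exp(-((np.arange(100) - 50) / 15.0) ** 2)   # reference profile
ev = np.exp(-((np.arange(100) - 51) / 15.0) ** 2)    # 1-voxel-shifted copy
print(f"pass rate: {gamma_pass_rate(ref, ev, spacing_mm=1.0):.3f}")
```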
Abstract:
An interim analysis is usually applied in later phase II or phase III trials to find convincing evidence of a significant treatment difference that may lead to trial termination at an earlier point than planned at the outset. This can save patient resources and shorten drug development and approval time. Ethics and economics are additional reasons to stop a trial early. In clinical trials involving eyes, ears, knees, arms, kidneys, lungs, and other clustered treatments, data may include distribution-free random variables with matched and unmatched subjects in one study. It is important to properly include both kinds of subjects in the interim and final analyses so that the maximum efficiency of statistical and clinical inference can be obtained at different stages of the trial. So far, no publication has applied a statistical method for distribution-free data with matched and unmatched subjects in the interim analysis of clinical trials. In this simulation study, the hybrid statistic was used to estimate the empirical powers and empirical type I errors among simulated datasets with different sample sizes, effect sizes, correlation coefficients for matched pairs, and data distributions in the interim and final analyses with 4 different group sequential methods. Empirical powers and empirical type I errors were also compared to those estimated using the meta-analysis t-test on the same simulated datasets. Results from this simulation study show that, compared to the meta-analysis t-test commonly used for data with normally distributed observations, the hybrid statistic has greater power for data observed from normally, log-normally, and multinomially distributed random variables with matched and unmatched subjects and with outliers. Powers rose with increases in sample size, effect size, and the correlation coefficient for the matched pairs. In addition, lower type I errors were observed with the hybrid statistic, which indicates that this test is also conservative for data with outliers in the interim analysis of clinical trials.
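The abstract does not define the hybrid statistic, so the sketch below substitutes a plain two-sample t-test inside a two-look Pocock group sequential design simply to show how empirical type I error is estimated by simulation. The Pocock nominal level of 0.0294 per look (overall alpha = 0.05, two looks) is the standard constant; everything else (sample sizes, distributions) is an illustrative assumption, not the paper's method.

```python
# Estimate the empirical type I error of a two-look group sequential
# design by simulating many trials under the null hypothesis.
import numpy as np
from scipy import stats

def simulate_type_i_error(n_per_arm=100, n_sims=20_000, look_alpha=0.0294,
                          seed=1):
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        a = rng.normal(0.0, 1.0, n_per_arm)   # both arms under H0
        b = rng.normal(0.0, 1.0, n_per_arm)
        half = n_per_arm // 2
        # Interim look on the first half of each arm.
        if stats.ttest_ind(a[:half], b[:half]).pvalue < look_alpha:
            rejections += 1
            continue                           # trial stops early
        # Final look on the full data.
        rejections += stats.ttest_ind(a, b).pvalue < look_alpha
    return rejections / n_sims

print(f"empirical type I error: {simulate_type_i_error():.4f}")
```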
Abstract:
In light of the new healthcare regulations, hospitals are increasingly reevaluating their IT integration strategies to meet expanded healthcare information exchange requirements. Nevertheless, hospital executives do not have all the information they need to differentiate between the available strategies and recognize what may best fit their organizational needs.

In the interest of providing the desired information, this study explored the relationships between hospital financial performance, integration strategy selection, and strategy change. The integration strategies examined, applied as binary logistic regression dependent variables and ordered from most to least integrated, were Single-Vendor (SV), Best-of-Suite (BoS), and Best-of-Breed (BoB). The financial measurements adopted as independent variables for the models were two administrative labor efficiency measures and six industry-standard financial ratios designed to provide a broad proxy of hospital financial performance. In addition, descriptive statistical analyses were carried out to evaluate recent trends in hospital integration strategy change. Overall, six research questions were proposed for this study.

The first research question sought to answer whether financial performance was related to the selection of integration strategies. The next questions explored whether hospitals were more likely to change strategies or remain the same when there was no external stimulus to change, and whether, if they did change, they would prefer strategies closer to the existing ones. These were followed by a question that inquired whether financial performance was also related to strategy change. Finally, the last two questions probed whether the new Health Information Technology for Economic and Clinical Health (HITECH) Act had any impact on the frequency and direction of strategy change.

The results confirmed that financial performance is related to both IT integration strategy selection and strategy change, and concurred with prior studies suggesting that hospital and environmental characteristics are associated factors as well. Specifically, this study noted that the most integrated SV strategy is related to increased administrative labor efficiency, and the hybrid BoS strategy is associated with improved financial health (based on operating margin and equity financing ratios). On the other hand, no financial indicators were found to be related to the least integrated BoB strategy, except for short-term liquidity (current ratio) when strategy change was involved.

Ultimately, this study concluded that when making IT integration strategy decisions, hospitals closely follow the resource dependence view of minimizing uncertainty. As each integration strategy may favor certain organizational characteristics, hospitals traditionally preferred not to make strategy changes, and when they did, they selected strategies more closely related to the existing ones. However, as new regulations further heighten revenue uncertainty while requiring increased information integration, and as evidence already suggests a growing trend of organizations shifting towards more integrated strategies, hospitals may become more limited in their strategy selection choices going forward.
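A minimal sketch of the modeling setup described above: a binary logistic regression with an integration-strategy indicator as the dependent variable and financial ratios as predictors. The column names, coefficients, and data here are hypothetical stand-ins; the study's actual predictors included two administrative labor efficiency measures and six financial ratios.

```python
# Binary logistic regression: does financial performance predict the
# choice of a given IT integration strategy? Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 300
df = pd.DataFrame({
    "operating_margin": rng.normal(0.03, 0.05, n),
    "equity_financing": rng.normal(0.45, 0.15, n),
    "current_ratio": rng.normal(2.0, 0.6, n),
})
# Hypothetical outcome: 1 = hospital uses the Single-Vendor (SV) strategy.
logit_p = -1.0 + 8.0 * df["operating_margin"] + 1.5 * df["equity_financing"]
df["uses_sv"] = rng.random(n) < 1 / (1 + np.exp(-logit_p))

X = sm.add_constant(df[["operating_margin", "equity_financing",
                        "current_ratio"]])
model = sm.Logit(df["uses_sv"].astype(int), X).fit(disp=False)
print(model.summary2().tables[1])  # coefficients, SEs, p-values
```

Fitting one such model per strategy (SV, BoS, BoB) mirrors the study's design of separate binary regressions rather than a single multinomial model.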