967 results for Calculus, Operational.
Abstract:
This paper provides evidence on the market reaction to corporate investment decisions whose shareholder value is largely attributable to growth options. The exploratory research paired pre-operational companies with operational companies in the same economic segments. Its purpose was to investigate whether financial indicators reflecting assets in place and growth assets statistically differentiate the two groups, and then to study the market reaction to changes in fixed assets as a signal of investment decisions. Pre-operational companies stand out because their operational asset base is still being formed and their shareholder value depends almost exclusively on asset growth. The differentiation tests confirmed that the value of pre-operational companies derived especially from growth options. The market reaction was notably stronger for pre-operational companies, which showed negative abnormal stock returns, while operational companies showed positive returns; this may indicate that the quality of the investment is judged on the basis of the financial disclosure, and that investors in operational companies wait for the disclosure before adjusting prices. We conclude that the results are consistent with the empirical evidence and that financial market participants should pay special attention to long-term capital formation investments.
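As a rough illustration of the measurement behind the abnormal returns reported above, the sketch below runs a market-model event study around a disclosure date. It is a minimal sketch under hypothetical assumptions (simulated returns, a 120-day estimation window, a [-1, +1] event window), not the authors' methodology.

```python
import numpy as np

def abnormal_returns(stock_ret, market_ret, event_idx, est_len=120, event_win=(-1, 1)):
    """Market-model event study: fit alpha/beta on a pre-event window, then
    compute abnormal returns AR_t = R_t - (alpha + beta * Rm_t) around the event."""
    est = slice(event_idx - est_len + event_win[0], event_idx + event_win[0])
    beta, alpha = np.polyfit(market_ret[est], stock_ret[est], 1)
    win = slice(event_idx + event_win[0], event_idx + event_win[1] + 1)
    ar = stock_ret[win] - (alpha + beta * market_ret[win])
    return ar, ar.sum()  # abnormal returns and cumulative abnormal return (CAR)

# Hypothetical daily returns; index 200 marks the fixed-asset disclosure date.
rng = np.random.default_rng(0)
rm = rng.normal(0.0005, 0.01, 300)
rs = 0.0002 + 1.2 * rm + rng.normal(0, 0.01, 300)
ar, car = abnormal_returns(rs, rm, event_idx=200)
print("AR window:", np.round(ar, 4), "CAR:", round(car, 4))
```

A negative CAR around the disclosure would correspond to the adverse market reaction described for pre-operational companies.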
Abstract:
Theory predicts that males adapt to sperm competition by increasing their investment in testis mass to transfer larger ejaculates. Experimental and comparative data support this prediction. Nevertheless, the relative importance of sperm competition in testis size evolution remains elusive, because experiments vary only sperm competition whereas comparative approaches confound it with other variables, in particular male mating rate. We addressed the relative importance of sperm competition and male mating rate by taking an experimental evolution approach. We subjected populations of Drosophila melanogaster to sex ratios of 1:1, 4:1, and 10:1 (female:male). Female bias decreased sperm competition but increased male mating rate and sperm depletion. After 28 generations of evolution, males from the 10:1 treatment had larger testes than males from other treatments. Thus, testis size evolved in response to mating rate and sperm depletion, not sperm competition. Furthermore, our experiment demonstrated that drift associated with sex ratio distortion limits adaptation; testis size only evolved in populations in which the effect of sex ratio bias on the effective population size had been compensated by increasing the numerical size. We discuss these results with respect to reproductive evolution, genetic drift in natural and experimental populations, and consequences of natural sex ratio distortion.
Abstract:
We introduce a variation of the proof for weak approximations that is suitable for studying the densities of stochastic processes which are evaluations of the flow generated by a stochastic differential equation on a random variable that may be anticipating. Our main assumption is that the process and the initial random variable have to be smooth in the Malliavin sense. Furthermore, if the inverse of the Malliavin covariance matrix associated with the process under consideration is sufficiently integrable, then approximations for densities and distributions can also be achieved. We apply these ideas to the case of stochastic differential equations with boundary conditions and the composition of two diffusions.
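For context, the Malliavin-calculus route to densities referred to here can be summarized by the classical integration-by-parts formula below; this is the standard textbook statement (e.g. Nualart), not a result specific to this paper.

```latex
% Density of a nondegenerate functional F in \mathbb{D}^{1,2} via integration by parts:
% if D F / \|D F\|_H^2 lies in the domain of the divergence (Skorokhod integral) \delta, then
\[
  p_F(x) \;=\; \mathbb{E}\!\left[ \mathbf{1}_{\{F > x\}}\,
      \delta\!\left( \frac{D F}{\|D F\|_{H}^{2}} \right) \right],
\]
% so integrability of the inverse Malliavin covariance controls the existence and
% regularity of the density, as the abstract indicates.
```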
Abstract:
This article is an introduction to Malliavin calculus for practitioners. We treat one specific application to the calculation of Greeks in finance. We also consider the kernel density method to compute Greeks, and an extension of the Vega index called the local Vega index.
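As an illustration of the kind of Greek computation discussed, the sketch below estimates the Delta of a European call in the Black-Scholes model using the well-known Malliavin weight W_T/(S_0 sigma T) from Fournié et al., alongside a pathwise benchmark. The model parameters are hypothetical, and this is a minimal sketch rather than the article's own implementation.

```python
import numpy as np

# Black-Scholes parameters (hypothetical values for illustration)
S0, K, r, sigma, T, n_paths = 100.0, 100.0, 0.05, 0.2, 1.0, 500_000

rng = np.random.default_rng(42)
W_T = rng.normal(0.0, np.sqrt(T), n_paths)                    # terminal Brownian motion
S_T = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * W_T)
payoff = np.maximum(S_T - K, 0.0)

# Malliavin (likelihood-ratio-type) estimator: Delta = e^{-rT} E[ payoff * W_T / (S0 * sigma * T) ]
delta_malliavin = np.exp(-r * T) * np.mean(payoff * W_T / (S0 * sigma * T))

# Pathwise benchmark: Delta = e^{-rT} E[ 1_{S_T > K} * S_T / S0 ]
delta_pathwise = np.exp(-r * T) * np.mean((S_T > K) * S_T / S0)

print(f"Malliavin Delta ~ {delta_malliavin:.4f}, pathwise Delta ~ {delta_pathwise:.4f}")
```

The Malliavin weight avoids differentiating the payoff, which is why the approach is attractive for discontinuous payoffs (digitals, barriers) where pathwise differentiation breaks down.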
Abstract:
The research reported in this series of articles aimed at (1) automating the search of questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples are analysed in an accurate and reproducible way and that they are compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin layer chromatography, despite its reputation of lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinions and which is usually not very informative. While there is room for improvement, this report demonstrates the significant gains over the traditional subjective approach in the search of ink specimens in ink databases and in the interpretation of their evidential value.
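The probabilistic model mentioned here assigns evidential value within the standard likelihood-ratio framework of forensic interpretation; the generic form is sketched below (this is the usual textbook formulation, not the authors' specific model).

```latex
% Evidential value of an ink comparison result E under competing propositions:
%   H_1: the questioned entry and the reference ink come from the same source,
%   H_2: they come from different sources.
\[
  \mathrm{LR} \;=\; \frac{\Pr(E \mid H_1)}{\Pr(E \mid H_2)},
  \qquad
  \underbrace{\frac{\Pr(H_1 \mid E)}{\Pr(H_2 \mid E)}}_{\text{posterior odds}}
  \;=\; \mathrm{LR} \times
  \underbrace{\frac{\Pr(H_1)}{\Pr(H_2)}}_{\text{prior odds}} .
\]
% LR > 1 supports H_1, LR < 1 supports H_2; its magnitude expresses the strength of the evidence.
```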
Abstract:
The planning effort for ISP began in 2006 when the IDOC retained the Durrant/PBA team of architects and planners to review the Iowa correctional system. The team conducted two studies in the following two years, the first being the April 2007 Iowa Department of Corrections Systemic Master Plan. Both studies addressed myriad aspects of the correctional system including treatment and re-entry needs and programs, security and training, and staffing.
Abstract:
Excessive daytime sleepiness underlies a large number of reported motor vehicle crashes. Fair and accurate field measures are needed to screen at-risk drivers who have been flagged as potentially driving in a sleep-deprived state on the basis of erratic driving behavior. The purpose of this research study was to evaluate a set of cognitive tests that can assist Motor Vehicle Enforcement Officers on duty in identifying drivers who may be engaged in sleep-impaired driving. Currently no gold-standard test exists to judge sleepiness in the field. Previous research has shown that the Psychomotor Vigilance Task (PVT) is sensitive to sleep deprivation. The first goal of the current study was to evaluate whether computerized tests of attention and memory, briefer than the PVT, would be as sensitive to sleepiness effects. The second goal of the study was to evaluate whether objective and subjective indices of acute and cumulative sleepiness predicted cognitive performance. Findings showed that sleepiness effects were detected in three out of six tasks. Furthermore, the PVT was the only task that showed a consistent sleepiness-related slowing of both 'best' (minimum) and 'typical' (median) response times. However, the PVT failed to show significant associations with objective measures of sleep deprivation (number of hours awake). The findings indicate that sleepiness tests in the field have significant limitations. They clearly show that it will not be possible to set absolute performance thresholds to identify sleep-impaired drivers based on cognitive performance on any test. Cooperation with industry to adjust work and rest cycles, and incentives to comply with such regulations, will be critical components of a broad policy to prevent sleepy truck drivers from getting on the road.
Abstract:
Research projects aimed at proposing fingerprint statistical models based on the likelihood ratio framework have shown that low-quality finger impressions left at crime scenes may have significant evidential value. These impressions are currently either not recovered, considered to be of no value when first analyzed by fingerprint examiners, or lead to inconclusive results when compared to control prints. There are growing concerns within the fingerprint community that recovering and examining these low-quality impressions will result in a significant increase in the workload of fingerprint units and, ultimately, in the number of backlogged cases. This study was designed to measure the number of impressions currently not recovered or not considered for examination, and to assess the usefulness of these impressions in terms of the number of additional detections that would result from their examination.
Abstract:
The cost of operational risk refers to the capital needed to cover the losses generated by the ordinary activities of a firm. In this work we demonstrate how allocation principles can be used to subdivide the aggregate capital so that the firm can distribute this cost across the various constituents that generate operational risk. Several capital allocation principles are reviewed. Proportional allocation allows a relative risk premium to be calculated and charged to each unit. An example of fraud risk in the banking sector is presented and several correlation scenarios between business lines are compared.
Keywords: solvency, quantile, value at risk, copulas
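To make the proportional allocation principle mentioned above concrete, the sketch below allocates an aggregate operational-risk capital across business lines in proportion to their standalone quantile-based (VaR) capital. The business lines, loss distributions, and the 99.9% confidence level are hypothetical assumptions, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical annual operational-loss simulations for three business lines
losses = {
    "retail_banking":   rng.lognormal(mean=10.0, sigma=1.2, size=100_000),
    "payments":         rng.lognormal(mean=9.5,  sigma=1.5, size=100_000),
    "asset_management": rng.lognormal(mean=9.0,  sigma=1.0, size=100_000),
}

alpha = 0.999  # VaR confidence level (hypothetical)
standalone_var = {bl: np.quantile(x, alpha) for bl, x in losses.items()}

# Aggregate capital from the elementwise sum of the simulated losses
# (independence across business lines is assumed in this toy setup).
aggregate_var = np.quantile(sum(losses.values()), alpha)

# Proportional allocation: K_i = VaR_i / sum_j VaR_j * K_aggregate
total = sum(standalone_var.values())
allocation = {bl: v / total * aggregate_var for bl, v in standalone_var.items()}

for bl in losses:
    print(f"{bl:17s} standalone VaR {standalone_var[bl]:12.0f}  allocated capital {allocation[bl]:12.0f}")
```

The ratio of allocated capital to expected loss per line is one way to express the relative risk premium charged to each unit.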
Abstract:
This report outlines the strategic plan, goals, and mission of the Iowa Department of Public Health.
Abstract:
Numerical weather prediction and climate simulation have been among the computationally most demanding applications of high performance computing ever since they were started in the 1950s. Since the 1980s, the most powerful computers have featured an ever larger number of processors. By the early 2000s, this number is often several thousand. An operational weather model must use all these processors in a highly coordinated fashion. The critical resource in running such models is not computation, but the amount of necessary communication between the processors. The communication capacity of parallel computers often falls far short of their computational power. The articles in this thesis cover fourteen years of research into how to harness thousands of processors for a single weather forecast or climate simulation, so that the application can benefit as much as possible from the power of parallel high performance computers. The results attained in these articles have already been widely applied, so that currently most of the organizations that carry out global weather forecasting or climate simulation anywhere in the world use methods introduced in them. Some further studies extend parallelization opportunities into other parts of the weather forecasting environment, in particular to data assimilation of satellite observations.
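To illustrate why inter-processor communication, rather than raw computation, becomes the bottleneck, the sketch below shows a one-dimensional halo (ghost-cell) exchange of the kind used in domain-decomposed atmospheric models, written with mpi4py. The grid size and decomposition are hypothetical; this is a generic pattern, not code from the thesis.

```python
# Run with e.g.: mpirun -np 4 python halo_exchange.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 1000                                # grid columns owned by this rank (hypothetical)
field = np.full(n_local + 2, float(rank))     # +2 halo cells, one on each side

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Exchange boundary columns with neighbours: send my edge cells, receive their halos.
comm.Sendrecv(sendbuf=field[1:2], dest=left, sendtag=0,
              recvbuf=field[n_local + 1:], source=right, recvtag=0)
comm.Sendrecv(sendbuf=field[n_local:n_local + 1], dest=right, sendtag=1,
              recvbuf=field[0:1], source=left, recvtag=1)

# Each time step every rank computes on n_local cells but must first receive 2 halo cells;
# as the number of ranks grows, this communication, not the arithmetic, limits scaling.
if rank == 0:
    print("halo received from right neighbour:", field[n_local + 1])
```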