946 results for Practical algorithm


Relevance:

20.00%

Publisher:

Abstract:

Background: We previously derived a clinical prognostic algorithm to identify patients with pulmonary embolism (PE) who are at low risk of short-term mortality and who could be safely discharged early or treated entirely in an outpatient setting. Objectives: To externally validate the clinical prognostic algorithm in an independent patient sample. Methods: We validated the algorithm in 983 consecutive patients prospectively diagnosed with PE at an emergency department of a university hospital. Patients with none of the algorithm's 10 prognostic variables (age ≥ 70 years, cancer, heart failure, chronic lung disease, chronic renal disease, cerebrovascular disease, pulse ≥ 110/min, systolic blood pressure < 100 mm Hg, oxygen saturation < 90%, and altered mental status) at baseline were defined as low-risk. We compared 30-day overall mortality among low-risk patients between the validation sample and the original derivation sample. We also assessed the rate of PE-related and bleeding-related mortality among low-risk patients. Results: Overall, the algorithm classified 16.3% of patients with PE as low-risk. Mortality at 30 days was 1.9% among low-risk patients and did not differ between the validation and the original derivation sample. Among low-risk patients, only 0.6% died from definite or possible PE, and none died from bleeding. Conclusions: This study validates an easy-to-use clinical prognostic algorithm that accurately identifies patients with PE who are at low risk of short-term mortality. Low-risk patients based on our algorithm are potential candidates for less costly outpatient treatment.
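
A minimal sketch of how such a rule-based classification could be coded, assuming illustrative field names (the study's actual data dictionary is not given in the abstract):

```python
# Hedged sketch: a patient is low-risk when none of the algorithm's
# 10 prognostic variables are present. Field names are illustrative.

def is_low_risk(patient: dict) -> bool:
    """Return True if the patient has none of the 10 prognostic variables."""
    return not any([
        patient["age"] >= 70,
        patient["cancer"],
        patient["heart_failure"],
        patient["chronic_lung_disease"],
        patient["chronic_renal_disease"],
        patient["cerebrovascular_disease"],
        patient["pulse"] >= 110,            # beats/min
        patient["systolic_bp"] < 100,       # mm Hg
        patient["oxygen_saturation"] < 90,  # percent
        patient["altered_mental_status"],
    ])
```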

Relevance:

20.00%

Publisher:

Abstract:

There is no doubt about the necessity of protecting digital communication: citizens entrust their most confidential and sensitive data to digital processing and communication, and so do governments, corporations, and armed forces. Digital communication networks are also an integral component of many critical infrastructures we depend on in our daily lives. Transportation services, financial services, energy grids, and food production and distribution networks are only a few examples of such infrastructures. Protecting digital communication means protecting confidentiality and integrity by encrypting and authenticating its contents. Yet most digital communication is not secure today, even though some of the most pressing problems could be solved with a more stringent use of current cryptographic technologies. Quite surprisingly, a new cryptographic primitive emerges from the application of quantum mechanics to information and communication theory: Quantum Key Distribution (QKD). QKD is difficult to understand, complex, technically challenging, and costly, yet it enables two parties to share a secret key for use in any subsequent cryptographic task, with unprecedented long-term security. It is disputed whether technically and economically feasible applications can be found. Our vision is that, despite its technical difficulty and inherent limitations, Quantum Key Distribution has great potential and fits well with other cryptographic primitives, enabling the development of highly secure new applications and services. In this thesis we take a structured approach to analyzing the practical applicability of QKD and present several use cases of different complexity for which it can be a technology of choice, either because of its unique forward-security features or because of its practicality.
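
For context, a toy simulation of the sifting phase of BB84, the canonical QKD protocol; this is an idealized model (no channel noise, no eavesdropper) sketched for illustration, not an implementation from the thesis:

```python
import random

def bb84_sifted_key(n_qubits: int = 1000) -> list[int]:
    """Idealized BB84: Alice sends random bits in random bases; Bob measures
    in random bases; positions where the bases differ are discarded."""
    alice_bits  = [random.randint(0, 1) for _ in range(n_qubits)]
    alice_bases = [random.choice("ZX") for _ in range(n_qubits)]
    bob_bases   = [random.choice("ZX") for _ in range(n_qubits)]
    # With matching bases Bob recovers Alice's bit exactly (no noise here);
    # mismatched positions yield random outcomes and are sifted out.
    return [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
```

On average half of the positions survive sifting; a real system would additionally estimate the error rate, correct errors, and apply privacy amplification.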

Relevance:

20.00%

Publisher:

Abstract:

In recent years many clinical prediction rules (CPRs) have been developed. Before a CPR can be used in clinical practice, several methodological steps are necessary, from the development of the score through internal and external validation to an impact study. Before using a CPR in daily practice, family doctors have to verify how the rule was developed and whether this was done in a population similar to the one in which they would use it. The aim of this paper is to describe the development of a CPR and to discuss the advantages and risks related to the use of CPRs, in order to help family doctors choose scores for use in their daily practice.

Relevance:

20.00%

Publisher:

Abstract:

Since the beginnings of computers as programmable machines, humans have tried to endow them with a certain intelligence so that they can think or reason as similarly as possible to humans. One of these attempts has been to make the machine able to think in such a way that it studies moves and wins games of chess. Nowadays, with current multitasking, object-oriented systems with memory access, and thanks to the powerful hardware at our disposal, we have a great variety of programs dedicated to playing chess. And there are not only small programs: there are even entire machines dedicated to calculating and studying moves in order to beat the best players in the world. The goal of my work is to carry out a study and an implementation of one of these programs, so it is divided into two parts. The theoretical part, or study, consists of a study of the artificial intelligence systems dedicated to playing chess, the study and search for a valid evaluation function, and a study of search algorithms. The practical part of the work is based on the implementation of an intelligent system capable of playing chess with a certain logic. This implementation is carried out with the help of the SDL libraries, using the minimax algorithm with alpha-beta pruning, in C++ code. As a conclusion of the project I would like to remark that the study showed me that creating a chess game was not as easy as I had thought, but it gave me the satisfaction of applying everything I learned during my degree and of discovering many other new things.
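
The thesis implements its engine in C++ with the SDL libraries; as an illustration of the search technique it names, here is a minimal, game-agnostic sketch of minimax with alpha-beta pruning (the evaluation function and move generation, which the theoretical part studies, are left as callbacks):

```python
import math

def alphabeta(state, depth, alpha, beta, maximizing, evaluate, moves, play):
    """Minimax search with alpha-beta pruning. `evaluate(state)` scores a
    position, `moves(state)` lists legal moves, `play(state, m)` applies one."""
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state)
    if maximizing:
        value = -math.inf
        for m in legal:
            value = max(value, alphabeta(play(state, m), depth - 1,
                                         alpha, beta, False, evaluate, moves, play))
            alpha = max(alpha, value)
            if alpha >= beta:   # beta cutoff: the opponent will avoid this branch
                break
        return value
    value = math.inf
    for m in legal:
        value = min(value, alphabeta(play(state, m), depth - 1,
                                     alpha, beta, True, evaluate, moves, play))
        beta = min(beta, value)
        if beta <= alpha:       # alpha cutoff
            break
    return value
```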

Relevance:

20.00%

Publisher:

Abstract:

The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e., an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data-increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure, in the absence of a priori knowledge about the image configuration, is a uniform field.
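
The FMAPE update itself is not spelled out in the abstract; as a reference point, here is a minimal sketch of the classical MLE (MLEM) iteration for Poisson emission data, started from the uniform image that the abstract identifies as the only correct initialization in the absence of prior knowledge. The system matrix `A` and count vector `y` are assumptions of the sketch:

```python
import numpy as np

def mlem(A: np.ndarray, y: np.ndarray, n_iter: int = 50) -> np.ndarray:
    """MLEM reconstruction: A is the (detectors x voxels) system matrix,
    y the measured counts; every voxel is assumed seen by some detector."""
    x = np.ones(A.shape[1])          # uniform initial image (see abstract)
    sens = A.sum(axis=0)             # sensitivity image, A^T 1
    for _ in range(n_iter):
        proj = A @ x                 # forward projection
        proj = np.where(proj > 0, proj, 1e-12)   # guard against division by zero
        x *= (A.T @ (y / proj)) / sens           # multiplicative EM update
    return x
```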

Relevance:

20.00%

Publisher:

Abstract:

Purpose: Given the preponderance of education reform since the No Child Left Behind Act (U.S. Department of Education, 2001), reform efforts have shaped the nature of the work and culture in schools. The emphasis on standardized testing to determine schools' status and student performance, among other factors, has generated stress, particularly for teachers. District and school administrators are therefore encouraged to consider the contextual factors that contribute to teacher stress, both to address them and to retain high-performing teachers. Research Methods/Approach: Participants were recruited from two types of schools in order to test hypotheses related to directional responding as a function of working in a more challenging (high-priority) or less challenging (non-high-priority) school environment. We employed content analysis to analyze 64 suburban elementary school teachers' free responses to a prompt regarding their stress as teachers, and cross-analyzed our findings through external auditing to bolster trustworthiness in the data and the procedure. Findings: Teachers reported personal and contextual stressors. We report concrete examples of the five categories of contextual stressors teachers identified: political and educational structures, instructional factors, student factors, parent and family factors, and school climate. We found directional qualities and overlapping relationships in the data, partially confirming our hypotheses. Implications for Research and Practice: We offer specific recommendations for practical ways in which school administrators might systemically address teacher stress based on the five categories of stressors reported by participants, and suggest means of conducting action research to measure the effects of the implemented suggestions.

Relevance:

20.00%

Publisher:

Abstract:

Assessment of locomotion through simple tests such as the timed up and go (TUG) or walking trials can provide valuable information for the evaluation of treatment and the early diagnosis of people with Parkinson's disease (PD). Common methods used in clinics are based either on complex motion-laboratory settings or on simple timing outcomes obtained with stopwatches. The goal of this paper is to present an innovative technology based on on-shoe wearable sensors and a processing algorithm that provides outcome measures characterizing PD motor symptoms during TUG and gait tests. Our results on ten PD patients and ten age-matched elderly subjects indicate an accuracy ± precision of 2.8 ± 2.4 cm/s for stride velocity and 1.3 ± 3.0 cm for stride length estimation compared to optical motion capture, with the advantage of being practical to use at home or in clinics without any discomfort for the subject. In addition, the use of novel spatio-temporal parameters, including turning, swing width, path length, and their inter-cycle variability, was also validated and showed interesting tendencies for discriminating between patients in ON and OFF states and control subjects.
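
The abstract does not detail the processing pipeline; a common approach for foot-worn inertial sensors estimates stride length by integrating acceleration between detected zero-velocity (foot-flat) instants, removing integration drift linearly over each stride. A rough sketch under those assumptions, where `acc` is gravity-compensated forward acceleration and `still` a precomputed foot-flat mask (both hypothetical inputs, not the paper's method):

```python
import numpy as np

def stride_lengths(acc: np.ndarray, still: np.ndarray, dt: float) -> list[float]:
    """Zero-velocity-update style stride length estimation (illustrative)."""
    vel = np.cumsum(acc) * dt                  # naive velocity by integration
    idx = np.flatnonzero(still)                # indices of foot-flat samples
    lengths = []
    for i0, i1 in zip(idx[:-1], idx[1:]):
        if i1 - i0 < 2:                        # same foot-flat period, skip
            continue
        # remove linear drift so velocity is zero at both foot-flat ends
        seg = vel[i0:i1] - np.linspace(vel[i0], vel[i1 - 1], i1 - i0)
        lengths.append(float(np.sum(seg) * dt))  # integrate velocity to length
    return lengths
```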

Relevance:

20.00%

Publisher:

Abstract:

In this paper we extend the linear reforms introduced by Pfähler (1984) to the case of dual taxes. We study the relative effect that dual linear cuts of a dual tax have on the distribution of inequality (a symmetric study can be done for tax increases). We also introduce measures of the degree of progressivity of dual taxes and show that they are connected with the Lorenz dominance criterion. Additionally, we study the tax-burden elasticity of each of the proposed reforms. Finally, using a microsimulation model and a large database containing information on the Spanish personal income tax (IRPF) for 2004, we 1) compare the effect that different reforms would have on the Spanish dual tax, and 2) study what redistribution of wealth the dual reform of the IRPF (Law 35/2006) implied with respect to the previous tax.
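
As an illustration of the Lorenz dominance criterion that the proposed progressivity measures connect to, a minimal sketch comparing (hypothetical) post-tax income samples under two reforms; equal sample sizes are assumed so the curves can be compared pointwise:

```python
import numpy as np

def lorenz(income: np.ndarray) -> np.ndarray:
    """Lorenz curve ordinates: cumulative income shares of the sorted sample."""
    s = np.sort(income)
    return np.concatenate(([0.0], np.cumsum(s) / s.sum()))

def lorenz_dominates(a: np.ndarray, b: np.ndarray) -> bool:
    """True if a's Lorenz curve lies nowhere below b's (a is more equal).
    Assumes len(a) == len(b) so the percentile grids coincide."""
    return bool(np.all(lorenz(a) >= lorenz(b)))
```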

Relevance:

20.00%

Publisher:

Abstract:

We consider stochastic partial differential equations with multiplicative noise. We derive an algorithm for the computer simulation of these equations. The algorithm is applied to study domain growth in a model with a conserved order parameter. The numerical results corroborate previous analytical predictions obtained by linear analysis.
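
The abstract does not specify the model or the discretization; one plausible minimal sketch is an Euler-Maruyama step for a 1D conserved (model-B-like) equation with multiplicative noise on a periodic grid. The equation, the noise form, and all parameters below are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def simulate(phi0: np.ndarray, dt=1e-4, dx=1.0, steps=10_000, eps=0.1, seed=0):
    """Euler-Maruyama for dphi = lap(phi^3 - phi - lap(phi)) dt + eps*lap(phi dW),
    periodic boundaries; applying lap to the noise keeps sum(phi) conserved."""
    rng = np.random.default_rng(seed)
    lap = lambda f: (np.roll(f, 1) + np.roll(f, -1) - 2.0 * f) / dx**2
    phi = phi0.astype(float).copy()
    for _ in range(steps):
        dW = rng.standard_normal(phi.size) * np.sqrt(dt)   # Gaussian increments
        phi += lap(phi**3 - phi - lap(phi)) * dt + eps * lap(phi * dW)
    return phi
```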

Relevance:

20.00%

Publisher:

Abstract:

We apply majorization theory to study the quantum algorithms known so far and find that there is a majorization principle underlying the way they operate. Grover's algorithm is a neat instance of this principle, where majorization works step by step until the optimal target state is found. Extensions of this situation are also found in algorithms based on quantum adiabatic evolution and in the family of quantum phase-estimation algorithms, including Shor's algorithm. We state that in quantum algorithms the time arrow is a majorization arrow.
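
The step-by-step majorization in Grover's algorithm can be checked numerically: after each iteration, the partial sums of the measurement probabilities sorted in decreasing order should never decrease. A small self-contained sketch (real amplitudes, target item 0 chosen for illustration):

```python
import numpy as np

def grover_majorization(n_qubits: int = 4) -> None:
    """Run Grover iterations and assert step-by-step majorization of the
    measurement probability distribution."""
    N = 2 ** n_qubits
    state = np.full(N, 1.0 / np.sqrt(N))     # uniform superposition
    prev = None
    for _ in range(int(np.pi / 4 * np.sqrt(N))):   # up to the optimal step
        state[0] *= -1.0                     # oracle: phase-flip the target
        state = 2.0 * state.mean() - state   # diffusion: inversion about the mean
        cum = np.cumsum(np.sort(state**2)[::-1])   # sorted partial sums
        if prev is not None:
            assert np.all(cum >= prev - 1e-12)     # majorization step check
        prev = cum
```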

Relevance:

20.00%

Publisher:

Abstract:

We present a numerical method for spectroscopic ellipsometry of thick transparent films. Assuming an analytical expression for the dispersion of the refractive index that contains several unknown coefficients, the procedure fits the coefficients at a fixed thickness; the thickness is then varied within a range around its approximate value. The final result given by our method is as follows: the sample thickness is taken to be the one that gives the best fit, and the refractive index is defined by the coefficients obtained for that thickness.
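
A minimal sketch of the fit-at-fixed-thickness-then-scan procedure, assuming a two-term Cauchy dispersion for the index and leaving the ellipsometric forward model as a placeholder callback (both are assumptions of the sketch, not the paper's specification):

```python
import numpy as np
from scipy.optimize import least_squares

def cauchy(wl, A, B):
    """Two-term Cauchy dispersion n(wl) = A + B / wl^2 (assumed form)."""
    return A + B / wl**2

def fit_film(wl, measured, model, d_range):
    """For each thickness d, fit the dispersion coefficients; return the
    (d, coefficients, residual) triple giving the best fit. `model(wl, n, d)`
    must compute the measured ellipsometric quantity for the film stack."""
    best = (None, None, np.inf)
    for d in d_range:
        res = least_squares(lambda c: model(wl, cauchy(wl, *c), d) - measured,
                            x0=[1.5, 0.01])
        if res.cost < best[2]:
            best = (d, res.x, res.cost)
    return best
```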

Relevance:

20.00%

Publisher:

Abstract:

The term "myocarditis" describes a nonspecific inflammation of the heart muscle. This inflammation, which is most often of infectious origin, can also be of toxic or immunological origin, and it may or may not be associated with pericarditis. The clinical picture is usually benign, but myocarditis can be complicated by sudden death or by rapidly progressive cardiogenic shock requiring circulatory support or transplantation. In nearly 25% of cases, the inflammation can become chronic, and the clinical course can then evolve toward global heart failure due to dilated cardiomyopathy. After presenting a case, we review the epidemiological, diagnostic, and therapeutic aspects of this entity, for which evidence-based data are surprisingly scarce. These elements justify the soundness of the pragmatic approach that is generally adopted.

Relevance:

20.00%

Publisher:

Abstract:

In addition to the Fair Housing Act of 1968, other legislation has expanded protection from discrimination for individuals with disabilities, including the Rehabilitation Act of 1973 and the Americans with Disabilities Act of 1990. Notably, the Fair Housing Amendments Act (FHAA), signed into law by Ronald Reagan in 1988, expanded equal housing protection to individuals with disabilities. The legislative history behind the 1988 Amendments notes that one aim of the law was to address both purposeful discrimination as well as what is sometimes unintentional discrimination caused by the design and construction of inaccessible housing.