799 results for "Tuning algorithm"
Abstract:
Background: We previously derived a clinical prognostic algorithm to identify patients with pulmonary embolism (PE) who are at low risk of short-term mortality and could be safely discharged early or treated entirely in an outpatient setting. Objectives: To externally validate the clinical prognostic algorithm in an independent patient sample. Methods: We validated the algorithm in 983 consecutive patients prospectively diagnosed with PE at the emergency department of a university hospital. Patients with none of the algorithm's 10 prognostic variables (age ≥ 70 years, cancer, heart failure, chronic lung disease, chronic renal disease, cerebrovascular disease, pulse ≥ 110/min, systolic blood pressure < 100 mm Hg, oxygen saturation < 90%, and altered mental status) at baseline were defined as low-risk. We compared 30-day overall mortality among low-risk patients between the validation sample and the original derivation sample. We also assessed the rates of PE-related and bleeding-related mortality among low-risk patients. Results: Overall, the algorithm classified 16.3% of patients with PE as low-risk. Mortality at 30 days was 1.9% among low-risk patients and did not differ between the validation and derivation samples. Among low-risk patients, only 0.6% died from definite or possible PE, and none died from bleeding. Conclusions: This study validates an easy-to-use clinical prognostic algorithm that accurately identifies patients with PE at low risk of short-term mortality. Low-risk patients according to our algorithm are potential candidates for less costly outpatient treatment.
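The rule described in this abstract is a simple all-or-nothing predicate over the 10 prognostic variables. A minimal sketch follows; the field names are illustrative, not taken from the original study:

```python
# Sketch of the low-risk rule: a patient is low-risk only if NONE of the
# 10 prognostic variables is present at baseline. Field names are
# illustrative assumptions, not from the original study.

def is_low_risk(patient: dict) -> bool:
    """Return True if the patient has none of the 10 prognostic variables."""
    criteria = [
        patient["age"] >= 70,
        patient["cancer"],
        patient["heart_failure"],
        patient["chronic_lung_disease"],
        patient["chronic_renal_disease"],
        patient["cerebrovascular_disease"],
        patient["pulse"] >= 110,            # beats per minute
        patient["systolic_bp"] < 100,       # mm Hg
        patient["oxygen_saturation"] < 90,  # percent
        patient["altered_mental_status"],
    ]
    return not any(criteria)
```

A single positive criterion, such as a pulse of 115/min, is enough to exclude a patient from the low-risk group.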
Abstract:
The development and testing of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
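The maximum-likelihood baseline that entropy-prior methods such as FMAPE extend can be sketched with the standard multiplicative MLEM update, starting from a uniform initial image as the abstract recommends. The system matrix and data below are a toy example, not from the paper:

```python
import numpy as np

# Toy MLEM (maximum-likelihood expectation-maximization) iteration for
# emission tomography. A and y are illustrative; real systems are far larger.

def mlem(A, y, n_iter=500):
    """Iterate the multiplicative MLEM update from a uniform initial image."""
    sens = A.sum(axis=0)                # sensitivity image: column sums of A
    x = np.ones(A.shape[1])             # uniform start, as the abstract advises
    for _ in range(n_iter):
        proj = A @ x                    # forward projection of current estimate
        ratio = y / np.maximum(proj, 1e-12)
        x = x / sens * (A.T @ ratio)    # multiplicative update
    return x

A = np.array([[1.0, 0.5], [0.2, 1.0], [0.7, 0.7]])
x_true = np.array([2.0, 3.0])
y = A @ x_true                          # noiseless data for this sketch
x_hat = mlem(A, y)
```

With noiseless, consistent data the iteration converges to the true activity; FMAPE-type algorithms modify this update with the entropy prior and acceleration exponent described above.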
Abstract:
Hepatitis A virus (HAV), the prototype of the genus Hepatovirus, has several unique biological characteristics that distinguish it from other members of the family Picornaviridae. Among these, its requirement for an intact eIF4G factor to initiate translation means that it cannot shut down host protein synthesis by a mechanism similar to that of other picornaviruses. Consequently, HAV must compete inefficiently for the cellular translational machinery, which may explain its poor growth in cell culture. In this context of virus/cell competition, HAV has strategically adopted a naturally, highly deoptimized codon usage with respect to that of its cellular host. To test whether the virus would optimize its codon usage, it was adapted to propagate in cells with impaired protein synthesis, making tRNA pools more available to the virus. The immediate response to the adaptation process was a significant loss of fitness, which was later recovered and was associated with a re-deoptimization, rather than an optimization, of codon usage specifically in the capsid-coding region. These results exclude translational selection and instead suggest selection for fine-tuned translation kinetics as the underlying mechanism of the codon usage bias in this genome region. Additionally, the results provide clear evidence of Red Queen dynamics of evolution, since the virus evolved to re-adapt its codon usage to the changing cellular environment in order to recover its original fitness.
Abstract:
Arrays of vertically aligned ZnO:Cl/ZnO core-shell nanowires were used to demonstrate that controlling the coaxial doping profile in homojunction nanostructures can improve their surface charge carrier transfer while conserving potentially excellent transport properties. It is experimentally shown that the presence of a ZnO shell enhances the photoelectrochemical properties of ZnO:Cl nanowires by up to a factor of 5. Likewise, the ZnO shell promotes the visible photoluminescence band in highly conducting ZnO:Cl nanowires. These lines of evidence are associated with the widening of the nanowires' surface depletion layer.
Abstract:
BACKGROUND: Notch signaling regulates multiple differentiation processes and cell fate decisions during both invertebrate and vertebrate development. Numb encodes an intracellular protein that was shown in Drosophila to antagonize Notch signaling at binary cell fate decisions of certain cell lineages. Although overexpression experiments suggested that Numb might also antagonize some Notch activity in vertebrates, the developmental processes in which Numb is involved remained elusive. RESULTS: We generated mice with a homozygous inactivation of Numb. These mice died before embryonic day E11.5, probably because of defects in angiogenic remodeling and placental dysfunction. Mutant embryos had an open anterior neural tube and impaired neuronal differentiation within the developing cranial central nervous system (CNS). In the developing spinal cord, the number of differentiated motoneurons was reduced. Within the peripheral nervous system (PNS), ganglia of cranial sensory neurons were formed. Trunk neural crest cells migrated and differentiated into sympathetic neurons. In contrast, a selective differentiation anomaly was observed in dorsal root ganglia, where neural crest-derived progenitor cells had migrated normally to form ganglionic structures, but failed to differentiate into sensory neurons. CONCLUSIONS: Mouse Numb is involved in multiple developmental processes and required for cell fate tuning in a variety of lineages. In the nervous system, Numb is required for the generation of a large subset of neuronal lineages. The restricted requirement of Numb during neural development in the mouse suggests that in some neuronal lineages, Notch signaling may be regulated independently of Numb.
Abstract:
We consider stochastic partial differential equations with multiplicative noise. We derive an algorithm for the computer simulation of these equations. The algorithm is applied to study domain growth of a model with a conserved order parameter. The numerical results corroborate previous analytical predictions obtained by linear analysis.
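The abstract does not give the scheme itself; a generic sketch of simulating a stochastic PDE with multiplicative noise is an explicit Euler-Maruyama discretization on a 1D periodic grid, shown here for an illustrative equation du = D u_xx dt + g u dW (parameters and equation are assumptions for the sketch):

```python
import numpy as np

# Illustrative Euler-Maruyama simulation of a 1D stochastic heat equation
# with multiplicative noise, du = D*u_xx dt + g*u dW, on a periodic grid.
# Not the paper's specific model; spatial noise correlations are omitted.

rng = np.random.default_rng(0)
N, dt, dx = 64, 1e-4, 0.1          # grid size, time step, grid spacing
D, g, steps = 1.0, 0.5, 1000       # diffusion, noise strength, iterations

u = np.ones(N)                     # initial field
for _ in range(steps):
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2   # periodic Laplacian
    dW = rng.normal(0.0, np.sqrt(dt), N)                     # Wiener increments
    u = u + D * lap * dt + g * u * dW                        # multiplicative noise
```

The time step respects the explicit stability bound D*dt/dx**2 << 1; studying domain growth of a conserved order parameter, as the abstract does, would replace the right-hand side with the corresponding conserved (model B-type) dynamics.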
Abstract:
We apply majorization theory to study the quantum algorithms known so far and find that there is a majorization principle underlying the way they operate. Grover's algorithm is a neat instance of this principle, where majorization works step by step until the optimal target state is found. Extensions of this situation are also found in algorithms based on quantum adiabatic evolution and in the family of quantum phase-estimation algorithms, including Shor's algorithm. We conclude that in quantum algorithms the time arrow is a majorization arrow.
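The majorization relation invoked here can be checked mechanically: a distribution q majorizes p when every partial sum of q's sorted probabilities dominates the corresponding partial sum of p's. A small sketch with illustrative distributions (mimicking how Grover's iteration makes the outcome distribution progressively more peaked):

```python
import numpy as np

# Checker for the majorization relation: q majorizes p iff the cumulative
# sums of the descending-sorted entries of q dominate those of p.
# The two example distributions are illustrative.

def majorizes(q, p):
    """Return True if probability vector q majorizes probability vector p."""
    qs = np.sort(q)[::-1].cumsum()   # descending sorted partial sums of q
    ps = np.sort(p)[::-1].cumsum()   # descending sorted partial sums of p
    return bool(np.all(qs >= ps - 1e-12))

uniform = np.full(4, 0.25)                    # maximally spread distribution
peaked = np.array([0.7, 0.1, 0.1, 0.1])       # more concentrated distribution
```

Here the peaked distribution majorizes the uniform one but not vice versa, which is the direction of the "majorization arrow" the abstract describes.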
Abstract:
The paper deals with the development and application of a generic methodology for automatic processing (mapping and classification) of environmental data. The General Regression Neural Network (GRNN) is considered in detail and is proposed as an efficient tool for spatial data mapping (regression). The Probabilistic Neural Network (PNN) is considered as an automatic tool for spatial classification. The automatic tuning of isotropic and anisotropic GRNN/PNN models using a cross-validation procedure is presented. Results are compared with the k-Nearest-Neighbours (k-NN) interpolation algorithm using an independent validation data set. Real case studies are based on decision-oriented mapping and classification of radioactively contaminated territories.
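An isotropic GRNN is essentially Nadaraya-Watson kernel regression with a single smoothing parameter, and the automatic tuning described above can be sketched as leave-one-out cross-validation over that parameter. The data here are synthetic, and the details differ from the paper's real case studies:

```python
import numpy as np

# Minimal isotropic GRNN (Nadaraya-Watson kernel regression) with its
# smoothing parameter sigma tuned by leave-one-out cross-validation.
# Synthetic 2-D data stand in for the environmental measurements.

def grnn_predict(Xtr, ytr, Xte, sigma):
    """GRNN prediction: Gaussian-kernel weighted average of training targets."""
    d2 = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma**2))
    return (w @ ytr) / np.maximum(w.sum(1), 1e-300)

def loo_error(X, y, sigma):
    """Leave-one-out mean squared error for a given sigma."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(w, 0.0)                 # exclude each point from its own fit
    pred = (w @ y) / np.maximum(w.sum(1), 1e-300)
    return ((pred - y) ** 2).mean()

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (200, 2))                               # spatial locations
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.normal(size=200)  # noisy field
sigmas = np.logspace(-2, 0, 20)                               # candidate bandwidths
best = min(sigmas, key=lambda s: loo_error(X, y, s))          # automatic tuning
```

An anisotropic variant, as mentioned in the abstract, would replace the single sigma with one bandwidth per coordinate in the squared-distance computation.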
Abstract:
We herein present a preliminary practical algorithm for evaluating complementary and alternative medicine (CAM) for children that relies on basic bioethical principles and considers the influence of CAM on global child healthcare. CAM is currently involved in almost all sectors of pediatric care and frequently represents a challenge to the pediatrician. The aim of this article is to provide a decision-making tool to assist the physician, especially as it remains difficult to keep up to date with the latest developments in the field. The reasonable application of our algorithm, together with common sense, should enable the pediatrician to decide whether pediatric (P-)CAM represents potential harm to the patient, and allow ethically sound counseling. In conclusion, we propose a pragmatic algorithm designed to evaluate P-CAM, briefly explain the underlying rationale, and give a concrete clinical example.
Abstract:
We present a numerical method for spectroscopic ellipsometry of thick transparent films. Assuming an analytical expression for the dispersion of the refractive index (which contains several unknown coefficients), the procedure fits the coefficients at a fixed thickness and then varies the thickness within a range around its approximate value. The sample thickness is taken to be the one that gives the best fit, and the refractive index is defined by the coefficients obtained for that thickness.
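The fit-at-fixed-thickness search can be sketched with a toy interference signal and a Cauchy dispersion model; both are illustrative stand-ins for the actual ellipsometric quantities and fit procedure in the paper:

```python
import numpy as np

# Sketch of the method: for each trial thickness d, fit the dispersion
# coefficients, then keep the d with the smallest residual. The signal
# s = cos(4*pi*n(lam)*d/lam) and n(lam) = A + B/lam^2 are illustrative.

lam = np.linspace(400.0, 800.0, 50)                 # wavelengths (nm)
d_true, A_true, B_true = 1200.0, 1.45, 3.0e4        # synthetic "sample"
signal = np.cos(4 * np.pi * (A_true + B_true / lam**2) * d_true / lam)

def best_fit_residual(d, lam, signal):
    """Best misfit over a coarse (A, B) coefficient grid at fixed thickness d."""
    A = np.linspace(1.30, 1.60, 31)[:, None, None]
    B = np.linspace(1.0e4, 5.0e4, 21)[None, :, None]
    model = np.cos(4 * np.pi * (A + B / lam**2) * d / lam)
    return ((model - signal) ** 2).sum(axis=2).min()

ds = np.linspace(1000.0, 1400.0, 81)                # trial thickness range
resids = [best_fit_residual(d, lam, signal) for d in ds]
d_best = ds[int(np.argmin(resids))]                 # thickness with best fit
```

Because a wrong thickness changes the chirp of the interference fringes in a way no dispersion coefficients can absorb, the residual-versus-thickness curve has a sharp minimum at the true thickness.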
Abstract:
We analyze the emergence of synchronization in a population of moving integrate-and-fire oscillators. Oscillators, while moving on a plane, interact with their nearest neighbor when they fire. We discover a nonmonotonic dependence of the synchronization time on the velocity of the agents. Moreover, we find that the mechanisms that drive synchronization differ between dynamical regimes. We report the extreme situation in which an interplay between the time scales involved in the dynamical processes completely inhibits the achievement of a coherent state. We also provide estimators for the transitions between the different regimes.
Abstract:
Competences have become the cornerstone of the reform of educational systems, including university teaching. In the latter case, the Tuning project has played an extraordinarily important role internationally and has attracted as many unwavering endorsements as devastating criticisms. This article starts from a close reading of the project's texts and pursues a critical examination of its notion of competence. To this end, attention is paid to the most salient aspects affecting the nature and constitutive elements of competences (in that recurrent plural used in Tuning) in relation to the educational process and the learner. The political and economic context in which the project arose, from which its premises, purpose, and objectives are established, is also considered. All of this is essential for gauging the project's impact and evolution in the process of higher education reform.
Abstract:
The artificial dsRNA polyriboinosinic acid-polyribocytidylic acid, poly(I:C), is a potent adjuvant candidate for vaccination, as it strongly drives cell-mediated immunity. However, because of its effects on non-immune bystander cells, poly(I:C) administration may carry a risk of promoting autoimmune diseases; it should therefore be applied at the lowest possible dose. We investigated microspheres carrying surface-assembled poly(I:C) as a two-in-one adjuvant formulation to stimulate maturation of monocyte-derived dendritic cells (MoDCs). Negatively charged polystyrene microspheres were equipped with a poly(ethylene glycol) corona through electrostatically driven surface assembly of a library of polycationic poly(l-lysine)-graft-poly(ethylene glycol) copolymers, PLL-g-PEG. Stable surface assembly of poly(I:C) was achieved by incubating the polymer-coated microspheres in an aqueous poly(I:C) solution. Surface-assembled poly(I:C) exhibited a strongly enhanced efficacy in stimulating maturation of MoDCs, by up to two orders of magnitude compared to free poly(I:C). Multiple phagocytosis events were the key factor in this enhanced efficacy. The cytokine secretion pattern of MoDCs after exposure to surface-assembled poly(I:C) differed from that of free poly(I:C), while their ability to stimulate T cell proliferation was similar. Overall, phagocytic signaling plays an important role in defining the resulting immune response to such two-in-one adjuvant formulations.
Abstract:
The primary goal of this project is to demonstrate the accuracy and utility of a freezing drizzle algorithm that can be implemented on roadway environmental sensing systems (ESSs). The problems related to the occurrence of freezing precipitation range from simple traffic delays to major accidents involving fatalities. Freezing drizzle can also have economic impacts on communities through lost work hours, vehicular damage, and downed power lines. Transportation agencies have means to perform preventive and reactive treatments of roadways, but freezing drizzle can be difficult to forecast accurately, or even to detect, because weather radar and surface observation networks observe these conditions poorly. The detection of freezing precipitation is problematic and requires special instrumentation and analysis. The Federal Aviation Administration's (FAA) development of aircraft anti-icing and deicing technologies has produced a freezing drizzle algorithm that uses air temperature data and a specialized sensor capable of detecting ice accretion. However, at present, roadway ESSs are not capable of reporting freezing drizzle. This study investigates the use of the methods developed for the FAA and the National Weather Service (NWS) in a roadway environment to detect the occurrence of freezing drizzle using a combination of icing detection equipment and available ESS sensors. The work incorporated the algorithm initially developed, and further modified, for the FAA's aircraft-icing applications, applying it to data from standard roadway ESSs.
The work performed in this study lays the foundation for addressing a central question for winter maintenance professionals: whether roadside freezing precipitation detection (e.g., icing detection) sensors can be used to determine the occurrence of pavement icing during freezing precipitation events and the rates at which it occurs.
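The core logic of such a detection scheme, combining an icing sensor with standard ESS readings, can be sketched as a simple rule. The thresholds and field names below are assumptions for illustration, not the study's actual algorithm:

```python
# Illustrative sketch of a freezing-drizzle flag in the spirit of the
# FAA/NWS approach described above: ice is accreting at sub-freezing air
# temperature while radar/surface networks report no significant
# precipitation. Thresholds and inputs are assumptions, not the study's.

def freezing_drizzle_flag(air_temp_c: float,
                          ice_accretion_rate_mm_hr: float,
                          radar_detects_precip: bool) -> bool:
    """Return True when conditions are consistent with freezing drizzle."""
    return (air_temp_c <= 0.0                   # sub-freezing air
            and ice_accretion_rate_mm_hr > 0.0  # icing sensor reports accretion
            and not radar_detects_precip)       # precipitation missed by radar
```

In practice the rule would be evaluated over a time window of ESS observations and combined with additional checks (e.g., distinguishing freezing drizzle from freezing rain or frost) before triggering roadway treatment.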