931 results for k-Means algorithm


Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: We investigated whether the INTERMED, a generic instrument for assessing biopsychosocial case complexity and directing care, identifies organ transplant patients at risk of unfavourable post-transplant development, by comparing it to the Transplant Evaluation Rating Scale (TERS), the established measure for pretransplant psychosocial evaluation. METHOD: One hundred nineteen kidney, liver, and heart transplant candidates were evaluated using the INTERMED, TERS, SF-36, EuroQol, Montgomery-Åsberg Depression Rating Scale (MADRS), and Hospital Anxiety & Depression Scale (HADS). RESULTS: We found significant relationships between the INTERMED and TERS scores. The INTERMED correlated highly with the HADS, MADRS, and the mental and physical health scores of the SF-36 Health Survey. CONCLUSIONS: The results demonstrate the validity and usefulness of the INTERMED instrument for pretransplant evaluation. Furthermore, our findings demonstrate the different qualities of the INTERMED and TERS in clinical practice. The advantages of the psychiatric focus of the TERS and of the biopsychosocial perspective of the INTERMED are discussed in the context of the current literature on integrated care.

Relevance:

30.00%

Publisher:

Abstract:

The Polochic and Motagua faults define the active plate boundary between the North American and Caribbean plates in central Guatemala. A splay of the Polochic Fault traverses the rapidly growing city of San Miguel Uspantán, which is periodically affected by destructive earthquakes. This fault splay was located using a 2D electrical resistivity tomography (ERT) survey that also characterized the fault damage zone and evaluated the thickness and nature of the recent deposits upon which most of the city is built. ERT images show the fault as a ~50 m wide, near-vertical low-resistivity anomaly, bounded within a few meters by high-resistivity anomalies. Forward modeling reproduces the key aspects of the observed electrical resistivity data with remarkable fidelity, thus defining the overall location, geometry, and internal structure of the fault zone as well as the affected lithologies. Our results indicate that the city is constructed on a ~20 m thick surficial layer consisting of poorly consolidated, highly porous, water-logged pumice. This soft layer is likely to amplify seismic waves and to liquefy upon moderate to strong ground shaking. The electrical conductivity as well as the major-element chemistry of the groundwater suggests that the local aquifer might, at least in part, be fed by water rising along the fault. The potential threat posed by this fault splay may therefore not be limited to its seismic activity per se, but could be compounded by its propensity to enhance seismic site effects by injecting water into the soft surficial sediments. The results of this study provide the basis for a rigorous analysis of seismic hazard and for the sustainable development of San Miguel Uspantán, and illustrate the potential of ERT surveying for paleoseismic studies.

Relevance:

30.00%

Publisher:

Abstract:

The primary goal of this project is to demonstrate the accuracy and utility of a freezing-drizzle algorithm that can be implemented on roadway environmental sensing systems (ESSs). The problems caused by freezing precipitation range from simple traffic delays to major accidents involving fatalities. Freezing drizzle can also impose economic costs on communities through lost work hours, vehicle damage, and downed power lines. Transportation agencies have means to perform preventive and reactive roadway treatments, but freezing drizzle can be difficult to forecast accurately, or even to detect, because weather radar and surface observation networks observe these conditions poorly. The detection of freezing precipitation is problematic and requires special instrumentation and analysis. The Federal Aviation Administration's (FAA) development of aircraft anti-icing and deicing technologies has led to a freezing-drizzle algorithm that utilizes air temperature data and a specialized sensor capable of detecting ice accretion. However, at present, roadway ESSs are not capable of reporting freezing drizzle. This study investigates the use of the methods developed for the FAA and the National Weather Service (NWS) within a roadway environment to detect the occurrence of freezing drizzle using a combination of icing-detection equipment and available ESS sensors. The work performed in this study incorporated the algorithm initially developed, and further modified, for the FAA's aircraft-icing work. The freezing-drizzle algorithm developed for the FAA was applied using data from standard roadway ESSs.
The work performed in this study lays the foundation for addressing the central question of interest to winter maintenance professionals as to whether it is possible to use roadside freezing precipitation detection (e.g., icing detection) sensors to determine the occurrence of pavement icing during freezing precipitation events and the rates at which this occurs.
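The decision logic described above, combining air temperature with an ice-accretion sensor, can be sketched as a simple rule. The function name, thresholds, and inputs below are illustrative assumptions, not the actual FAA/NWS algorithm:

```python
def freezing_drizzle_flag(air_temp_c, ice_accretion_rate, precip_detected,
                          temp_threshold_c=0.0, accretion_threshold=0.01):
    """Return True when readings are consistent with freezing drizzle.

    Hypothetical thresholds: sub-freezing air temperature, precipitation
    reported by the ESS, and measurable accretion on the icing sensor.
    """
    return (air_temp_c <= temp_threshold_c
            and precip_detected
            and ice_accretion_rate >= accretion_threshold)
```

In an operational setting the thresholds would be tuned against verified freezing-drizzle events rather than fixed a priori.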

Relevance:

30.00%

Publisher:

Abstract:

Fetal MRI reconstruction aims at finding a high-resolution image given a small set of low-resolution images. It is usually modeled as an inverse problem in which the regularization term plays a central role in the reconstruction quality. The literature has considered several regularization terms, such as Dirichlet/Laplacian energy, Total Variation (TV)-based energies, and, more recently, non-local means. Although TV energies are attractive because of their edge-preserving ability, only standard explicit steepest-gradient techniques have been applied to optimize fetal-based TV energies. The main contribution of this work lies in the introduction of a TV algorithm that is well posed from the point of view of convex optimization. Specifically, our proposed TV optimization algorithm for fetal reconstruction is optimal with respect to the asymptotic and iterative convergence speeds, O(1/n²) and O(1/√ε), while existing techniques are in O(1/n) and O(1/ε). We apply our algorithm to (1) clinical newborn data, considered as ground truth, and (2) clinical fetal acquisitions. Our algorithm compares favorably with the literature in terms of speed and accuracy.
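The "standard explicit steepest-gradient" baseline that the abstract contrasts with accelerated O(1/n²) schemes can be illustrated on a toy 1-D TV-denoising problem. This is a generic sketch (smoothed TV subgradient, fixed step), not the paper's fetal reconstruction algorithm:

```python
import numpy as np

def tv_denoise_1d(y, lam=0.3, step=0.1, n_iter=200, eps=1e-8):
    """Explicit (sub)gradient descent on 0.5*||x - y||^2 + lam * TV(x).

    TV(x) = sum_i |x[i+1] - x[i]|, smoothed by eps so the gradient is
    defined everywhere.  This is the slow O(1/n) baseline; accelerated
    convex schemes achieve the faster rates cited in the abstract.
    """
    x = y.copy()
    for _ in range(n_iter):
        d = np.diff(x)
        w = d / np.sqrt(d * d + eps)   # smoothed sign of the differences
        g_tv = np.zeros_like(x)
        g_tv[:-1] -= w                 # each |x[i+1]-x[i]| pulls both ends
        g_tv[1:] += w
        x -= step * ((x - y) + lam * g_tv)
    return x
```

On noisy piecewise-constant signals this reduces the error relative to the noisy input while preserving jumps better than quadratic smoothing would.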

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Active screening by mobile teams is considered the best method for detecting human African trypanosomiasis (HAT) caused by Trypanosoma brucei gambiense, but the current funding context in many post-conflict countries limits this approach. As an alternative, non-specialist health care workers (HCWs) in peripheral health facilities could be trained to identify potential cases who need testing based on their symptoms. We explored the predictive value of syndromic referral algorithms to identify symptomatic cases of HAT among a treatment-seeking population in Nimule, South Sudan. METHODOLOGY/PRINCIPAL FINDINGS: Symptom data from 462 patients (27 cases) presenting for a HAT test via passive screening over a 7-month period were collected to construct and evaluate over 14,000 four-item syndromic algorithms considered simple enough to be used by peripheral HCWs. For comparison, algorithms developed in other settings were also tested on our data, and a panel of expert HAT clinicians was asked to make referral decisions based on the symptom dataset. The best performing algorithms consisted of three core symptoms (sleep problems, neurological problems and weight loss), with or without a history of oedema, cervical adenopathy or proximity to livestock. They had a sensitivity of 88.9-92.6%, a negative predictive value of up to 98.8% and a positive predictive value in this context of 8.4-8.7%. In terms of sensitivity, these out-performed more complex algorithms identified in other studies, as well as the expert panel. The best-performing algorithm is predicted to identify about 9/10 treatment-seeking HAT cases, though only 1/10 patients referred would test positive. CONCLUSIONS/SIGNIFICANCE: In the absence of regular active screening, improving referrals of HAT patients through other means is essential. Systematic use of syndromic algorithms by peripheral HCWs has the potential to increase case detection and would increase their participation in HAT programmes.
The algorithms proposed here, though promising, should be validated elsewhere.
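A rule of the kind described, built around the three core symptoms, is simple enough to express in a few lines. The field names below are made up for illustration, and the exact combination logic of the best four-item algorithms (core trio "with or without" one supporting item) is an assumption:

```python
# Hypothetical symptom labels; a real checklist would use local terminology.
CORE = {"sleep_problems", "neurological_problems", "weight_loss"}
SUPPORTING = {"oedema_history", "cervical_adenopathy", "livestock_proximity"}

def refer_for_hat_test(symptoms):
    """Refer a patient for a HAT test when all three core symptoms are
    present; supporting items may accompany them but are not required
    in this simplified reading of the abstract's best algorithms."""
    return CORE <= set(symptoms)
```

Such a rule trades precision for sensitivity deliberately: most referred patients will test negative, but few true cases are missed.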

Relevance:

30.00%

Publisher:

Abstract:

Simulated-annealing-based conditional simulations provide a flexible means of quantitatively integrating diverse types of subsurface data. Although such techniques are being increasingly used in hydrocarbon reservoir characterization studies, their potential in environmental, engineering and hydrological investigations is still largely unexploited. Here, we introduce a novel simulated annealing (SA) algorithm geared towards the integration of high-resolution geophysical and hydrological data which, compared to more conventional approaches, provides significant advancements in the way that large-scale structural information in the geophysical data is accounted for. Model perturbations in the annealing procedure are made by drawing from a probability distribution for the target parameter conditioned to the geophysical data. This is the only place where geophysical information is utilized in our algorithm, which is in marked contrast to other approaches where model perturbations are made through the swapping of values in the simulation grid and agreement with soft data is enforced through a correlation coefficient constraint. Another major feature of our algorithm is the way in which available geostatistical information is utilized. Instead of constraining realizations to match a parametric target covariance model over a wide range of spatial lags, we constrain the realizations only at smaller lags where the available geophysical data cannot provide enough information. Thus we allow the larger-scale subsurface features resolved by the geophysical data to exert much more direct control on the output realizations. Further, since the only component of the SA objective function required in our approach is a covariance constraint at small lags, our method has improved convergence and computational efficiency over more traditional methods.
Here, we present the results of applying our algorithm to the integration of porosity log and tomographic crosshole georadar data to generate stochastic realizations of the local-scale porosity structure. Our procedure is first tested on a synthetic data set, and then applied to data collected at the Boise Hydrogeophysical Research Site.
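The two distinctive ingredients described above — perturbations drawn from a data-conditioned distribution, and an objective restricted to covariance misfit at small lags — can be sketched in a toy 1-D annealing loop. The cooling schedule, function names, and the stand-in `cond_sampler` are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def sa_conditional_sim(n_cells, cond_sampler, target_cov, n_iter=5000,
                       t0=1.0, cooling=0.999, seed=0):
    """Toy SA conditional simulation.  `cond_sampler(i, rng)` stands in
    for the geophysics-conditioned distribution at cell i; `target_cov`
    lists the target covariance at lags 1..len(target_cov) only."""
    rng = np.random.default_rng(seed)
    x = np.array([cond_sampler(i, rng) for i in range(n_cells)])

    def objective(v):
        v0 = v - v.mean()
        return sum((np.mean(v0[:-h] * v0[h:]) - c) ** 2
                   for h, c in enumerate(target_cov, start=1))

    energy, temp = objective(x), t0
    for _ in range(n_iter):
        i = rng.integers(n_cells)
        old = x[i]
        x[i] = cond_sampler(i, rng)       # geophysics-conditioned redraw
        e_new = objective(x)
        # Metropolis acceptance with geometric cooling
        if e_new <= energy or rng.random() < np.exp((energy - e_new) / temp):
            energy = e_new
        else:
            x[i] = old
        temp *= cooling
    return x, energy
```

Because each candidate value already honors the conditioning distribution, the objective needs no separate soft-data term, which is the efficiency argument made in the abstract.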

Relevance:

30.00%

Publisher:

Abstract:

The pharmaceutical industry has been facing several challenges in recent years, and optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques participate actively in this optimization, especially when complemented by computational approaches aiming at rationalizing the enormous amount of information they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both rely heavily on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented: EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins.
The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure and, contrary to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root-mean-square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase, and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, and led to the successful design of new peptidic ligands for the α5β1 integrin and for the human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
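The two-fitness evolutionary scheme mentioned for EADock can be caricatured on a 1-D toy problem: a cheap "coarse" score prunes the population, a more expensive "fine" score ranks the survivors. Everything below — selection scheme, mutation scale, names — is an illustrative assumption, not the EADock algorithm:

```python
import random

def hybrid_ea(init_pop, fit_coarse, fit_fine, n_gen=50, elite=0.2, seed=0):
    """Toy hybrid evolutionary search with two fitness functions
    (lower is better for both).  Elitism keeps the best individuals;
    Gaussian mutation generates the rest of the next generation."""
    rng = random.Random(seed)
    pop = list(init_pop)
    size = len(pop)
    for _ in range(n_gen):
        pop.sort(key=fit_coarse)                  # prune with coarse score
        survivors = sorted(pop[:max(2, size // 2)], key=fit_fine)
        n_keep = max(1, int(size * elite))
        parents = survivors[:n_keep]
        # Gaussian mutation around the retained parents
        children = [p + rng.gauss(0, 0.1)
                    for p in parents for _ in range(size // n_keep)]
        pop = parents + children[:size - n_keep]
    return min(pop, key=fit_fine)
```

In a docking context the "individuals" would be ligand poses and the fitness functions energy evaluations, with clustering used to maintain diversity.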

Relevance:

30.00%

Publisher:

Abstract:

The influence of the pseudopotential on both the structure and the self-diffusion of liquid rubidium at the melting point has been investigated by means of molecular-dynamics calculations. The model potential considered has been computed from the pseudopotential of Ashcroft, the dielectric function of Geldart and Vosko, and a Born-Mayer term. Four different values for the core radius which enters as input in the pseudopotential have been considered. In this way we have been able to observe and interpret the effect of this contribution on the properties of the liquid.

Relevance:

30.00%

Publisher:

Abstract:

Normally either the Güntelberg or the Davies equation is used to predict activity coefficients of electrolytes in dilute solutions when no better equation is available. The validity of these equations, and additionally of the parameter-free equations used in the Bates-Guggenheim convention and in the Pitzer formalism for activity coefficients, was tested with experimentally determined activity coefficients of HCl, HBr, HI, LiCl, NaCl, KCl, RbCl, CsCl, NH4Cl, LiBr, NaBr and KBr in aqueous solutions at 298.15 K. The experimental activity coefficients of these electrolytes can usually be reproduced within experimental error by means of a two-parameter equation of the Hückel type. The best Hückel equations were also determined for all electrolytes considered. The data used in the calculations of this study cover almost all reliable galvanic cell results available in the literature for the electrolytes considered. The results of the calculations reveal that the parameter-free activity coefficient equations can only be used for very dilute electrolyte solutions in thermodynamic studies.
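The two parameter-free equations named above have standard textbook forms, shown here for a 1:1 electrolyte in water at 298.15 K (the Debye-Hückel slope A ≈ 0.509 is the commonly quoted value):

```python
import math

A_DEBYE = 0.509  # Debye-Hückel slope for water at 298.15 K (common value)

def log10_gamma_guntelberg(z, ionic_strength):
    """Güntelberg approximation: parameter-free, dilute solutions only.
    log10(gamma) = -A * z^2 * sqrt(I) / (1 + sqrt(I))."""
    s = math.sqrt(ionic_strength)
    return -A_DEBYE * z * z * s / (1.0 + s)

def log10_gamma_davies(z, ionic_strength):
    """Davies equation: the Güntelberg term plus an empirical 0.3*I
    correction, extending the usable ionic-strength range somewhat."""
    s = math.sqrt(ionic_strength)
    return -A_DEBYE * z * z * (s / (1.0 + s) - 0.3 * ionic_strength)
```

At I = 0.1 mol/kg the Davies equation gives γ± ≈ 0.78 for a 1:1 electrolyte, close to the experimental values for salts like NaCl; the study's point is that beyond very dilute solutions such parameter-free forms are no substitute for fitted Hückel parameters.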

Relevance:

30.00%

Publisher:

Abstract:

A Wiener system is a linear time-invariant filter, followed by an invertible nonlinear distortion. Assuming that the input signal is an independent and identically distributed (iid) sequence, we propose an algorithm for estimating the input signal only by observing the output of the Wiener system. The algorithm is based on minimizing the mutual information of the output samples, by means of a steepest descent gradient approach.
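The estimation idea can be sketched on a toy example. The paper minimizes the mutual information of the output samples; as a stand-in for that iid criterion, the sketch below uses a much cruder whiteness proxy (squared autocorrelations at small lags) with a finite-difference steepest descent, and it assumes the nonlinearity is known and invertible, which the original algorithm does not require:

```python
import numpy as np

def invert_wiener(y, inv_nonlinearity, filt_len=5, lr=0.05, n_iter=300):
    """Crude Wiener-system inversion sketch: undo the distortion, then
    adapt an FIR equalizer by steepest descent on a whiteness proxy."""
    x = inv_nonlinearity(np.asarray(y, float))
    w = np.zeros(filt_len)
    w[0] = 1.0                               # start from the identity filter

    def cost(w):
        s = np.convolve(x, w, mode="valid")
        s = (s - s.mean()) / (s.std() + 1e-12)
        # squared autocorrelations at lags 1..3 as an iid surrogate
        return sum(np.mean(s[:-h] * s[h:]) ** 2 for h in range(1, 4))

    for _ in range(n_iter):
        c0 = cost(w)
        g = np.empty_like(w)
        for k in range(filt_len):            # finite-difference gradient
            w_k = w.copy()
            w_k[k] += 1e-4
            g[k] = (cost(w_k) - c0) / 1e-4
        w -= lr * g
    return w, np.convolve(x, w, mode="valid")
```

A proper implementation would replace the autocorrelation proxy with an estimate of the output entropy or mutual information, as in the algorithm described above.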

Relevance:

30.00%

Publisher:

Abstract:

Fetal MRI reconstruction aims at finding a high-resolution image given a small set of low-resolution images. It is usually modeled as an inverse problem in which the regularization term plays a central role in the reconstruction quality. The literature has considered several regularization terms, such as Dirichlet/Laplacian energy [1], Total Variation (TV)-based energies [2,3] and, more recently, non-local means [4]. Although TV energies are attractive because of their edge-preserving ability, only standard explicit steepest-gradient techniques have been applied to optimize fetal-based TV energies. The main contribution of this work lies in the introduction of a TV algorithm that is well posed from the point of view of convex optimization. Specifically, our proposed TV optimization algorithm for fetal reconstruction is optimal with respect to the asymptotic and iterative convergence speeds, O(1/n²) and O(1/√ε), while existing techniques are in O(1/n) and O(1/ε). We apply our algorithm to (1) clinical newborn data, considered as ground truth, and (2) clinical fetal acquisitions. Our algorithm compares favorably with the literature in terms of speed and accuracy.

Relevance:

30.00%

Publisher:

Abstract:

We present a general algorithm for the simulation of x-ray spectra emitted from targets of arbitrary composition bombarded with kilovolt electron beams. Electron and photon transport is simulated by means of the general-purpose Monte Carlo code PENELOPE, using the standard, detailed simulation scheme. Bremsstrahlung emission is described by using a recently proposed algorithm, in which the energy of emitted photons is sampled from numerical cross-section tables, while the angular distribution of the photons is represented by an analytical expression with parameters determined by fitting benchmark shape functions obtained from partial-wave calculations. Ionization of K and L shells by electron impact is accounted for by means of ionization cross sections calculated from the distorted-wave Born approximation. The relaxation of the excited atoms following the ionization of an inner shell, which proceeds through emission of characteristic x rays and Auger electrons, is simulated until all vacancies have migrated to M and outer shells. For comparison, measurements of x-ray emission spectra generated by 20 keV electrons impinging normally on multiple bulk targets of pure elements, which span the periodic system, have been performed using an electron microprobe. Simulation results are shown to be in close agreement with these measurements.
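Sampling photon energies from numerical cross-section tables, as described for the bremsstrahlung model, is typically done by inverse-transform sampling of the tabulated distribution. The sketch below shows the generic technique on a made-up placeholder table (not PENELOPE data):

```python
import numpy as np

def sample_photon_energies(energy_grid, cross_section_table, n, seed=0):
    """Inverse-transform sampling from a tabulated differential cross
    section: build the cumulative table, draw uniforms, and invert the
    CDF by piecewise-linear interpolation."""
    rng = np.random.default_rng(seed)
    pdf = np.asarray(cross_section_table, float)
    cdf = np.cumsum(pdf)
    cdf /= cdf[-1]                      # normalized cumulative table
    u = rng.random(n)
    return np.interp(u, cdf, energy_grid)
```

With a 1/E-shaped table (a crude caricature of a bremsstrahlung spectrum), most sampled energies fall at the low end of the grid, as expected.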

Relevance:

30.00%

Publisher:

Abstract:

The aim of this work is to develop technology-oriented entrepreneurship at universities. SMEs are engaged through collaborative research, development, and innovation projects, which at the same time support the growth of SMEs, the internationalization of higher education, and students' employability in SMEs. The goal is to create new startup companies through cooperation between SMEs and universities, and in particular to help companies manage succession by improving interaction between the parties. Creating new growth-oriented businesses with the help of SMEs is seen as supporting existing business and opening further opportunities. Portfolio entrepreneurship is a form of company growth even when the size of the company itself changes little; it is an alternative for entrepreneurs who have the possibility to expand but prefer to keep the firm a family business. Expansion can nevertheless occur where SMEs share a common interest in it. Joint projects between SMEs and universities are seen to contribute significantly to the wellbeing of society as a whole, even at the international level. Such cooperation is also considered to improve the quality of learning and teachers' professional development, and higher education best supports innovation when it works together with SMEs. The research material was collected by interviewing experts who have distinguished themselves in their fields. Problem-focused interviews as a method revealed ways in which university students and SMEs can proceed and promote cooperation in the industrial sector.

Relevance:

30.00%

Publisher:

Abstract:

Prediction of variety composite means was shown to be feasible without diallel crossing the parental varieties. Thus, the predicted mean for a quantitative trait of a composite is given by: Yk = a1 ΣVj + a2 ΣTj + a3 V̄ − a4 T̄, with coefficients a1 = (n − 2k)/[k²(n − 2)]; a2 = 2n(k − 1)/[k²(n − 2)]; a3 = n(k − 1)/[k(n − 1)(n − 2)]; and a4 = n²(k − 1)/[k(n − 1)(n − 2)]; summation is for j = 1 to k, where k is the size of the composite (number of parental varieties of a particular composite) and n is the total number of parent varieties. Vj is the mean of varieties and Tj is the mean of topcrosses (pool of varieties as tester), and V̄ and T̄ are the respective average values in the whole set. Yield data from a 7 x 7 variety diallel cross were used for the variety means and for the "simulated" topcross means to illustrate the proposed procedure. The proposed prediction procedure was as effective as the prediction based on Yk = H̄ − (H̄ − V̄)/k, where H̄ and V̄ refer to the mean of hybrids (F1) and parental varieties, respectively, in a variety diallel cross. It was also shown in the analysis of variance that the total sum of squares due to treatments (varieties and topcrosses) can be orthogonally partitioned following the reduced model Yjj′ = μ + ½(vj + vj′) + h̄ + hj + hj′, thus making possible an F test for varieties, average heterosis and variety heterosis. Least-squares estimates of these effects are also given.
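The prediction formula can be checked numerically. The coefficients below are those quoted in the abstract; the pairing of a3 with the overall variety mean and a4 with the overall topcross mean is a reconstruction of the garbled original (it is consistent in that the prediction reduces to m when every variety and topcross mean equals m):

```python
def predict_composite_mean(V_sel, T_sel, V_all, T_all):
    """Predicted composite mean without diallel crossing.

    V_sel, T_sel: variety and topcross means of the k composite parents.
    V_all, T_all: the same quantities for all n parent varieties.
    """
    k, n = len(V_sel), len(V_all)
    a1 = (n - 2 * k) / (k ** 2 * (n - 2))
    a2 = 2 * n * (k - 1) / (k ** 2 * (n - 2))
    a3 = n * (k - 1) / (k * (n - 1) * (n - 2))
    a4 = n ** 2 * (k - 1) / (k * (n - 1) * (n - 2))
    v_bar = sum(V_all) / n
    t_bar = sum(T_all) / n
    return a1 * sum(V_sel) + a2 * sum(T_sel) + a3 * v_bar - a4 * t_bar
```

In the degenerate case with no heterosis (all means equal), the predicted composite mean equals that common value, as it must.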