906 results for Normalization-based optimization


Relevance:

30.00%

Publisher:

Abstract:

Recently, kernel-based machine learning methods have gained great popularity in many data analysis and data mining fields: pattern recognition, biocomputing, speech and vision, engineering, remote sensing, etc. The paper describes the use of kernel methods to process large datasets from environmental monitoring networks. Several typical problems of the environmental sciences and their solutions provided by kernel-based methods are considered: classification of categorical data (soil type classification), mapping of continuous environmental and pollution information (soil pollution by radionuclides), and mapping with auxiliary information (climatic data from the Aral Sea region). Promising developments, such as automatic emergency hot-spot detection and monitoring network optimization, are also discussed.
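The mapping task described above can be illustrated with a minimal kernel ridge regression sketch. This is not the paper's exact estimator, and the station coordinates and concentrations below are synthetic; it only shows how a kernel method interpolates pollution measurements over monitoring stations.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between coordinate sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, gamma=0.5, lam=1e-3):
    """Solve (K + lam*I) alpha = y on the measurement locations X."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(Xq, X, alpha, gamma=0.5):
    """Predict values at query locations Xq from the fitted weights."""
    return rbf_kernel(Xq, X, gamma) @ alpha

# toy example: interpolate pollutant levels measured at monitoring stations
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 2))                  # station coordinates
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)   # measured concentration
alpha = kernel_ridge_fit(X, y)
pred = kernel_ridge_predict(X, X, alpha)              # fitted values at stations
```

In practice the same `kernel_ridge_predict` call would be evaluated on a dense grid of unmonitored locations to produce a pollution map.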

We address the problem of scheduling a multiclass $M/M/m$ queue with Bernoulli feedback on $m$ parallel servers to minimize time-average linear holding costs. We analyze the performance of a heuristic priority-index rule, which extends Klimov's optimal solution to the single-server case: servers preemptively select customers with larger Klimov indices. We present closed-form suboptimality bounds (approximate optimality) for Klimov's rule, which imply that its suboptimality gap is uniformly bounded above with respect to (i) external arrival rates, as long as they stay within system capacity; and (ii) the number of servers. It follows that its relative suboptimality gap vanishes in a heavy-traffic limit, as external arrival rates approach system capacity (heavy-traffic optimality). We obtain simpler expressions for the special no-feedback case, where the heuristic reduces to the classical $c \mu$ rule. Our analysis is based on comparing the expected cost of Klimov's rule to the value of a strong linear programming (LP) relaxation of the system's region of achievable performance of mean queue lengths. To obtain this relaxation, we derive and exploit a new set of work decomposition laws for the parallel-server system. We further report on the results of a computational study on the quality of the $c \mu$ rule for parallel scheduling.
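In the no-feedback case mentioned above, the index rule reduces to ranking customer classes by the product $c_k \mu_k$ of holding cost and service rate. A minimal sketch (the class costs and rates below are made up for illustration):

```python
def cmu_priority_order(costs, service_rates):
    """Return class indices sorted by decreasing c*mu index
    (the classical cmu rule for the no-feedback case)."""
    indices = [c * mu for c, mu in zip(costs, service_rates)]
    return sorted(range(len(costs)), key=lambda k: -indices[k])

# three customer classes with holding costs c_k and service rates mu_k
order = cmu_priority_order([1.0, 4.0, 2.0], [3.0, 1.0, 5.0])
# each free server preemptively picks a waiting customer of the
# highest-ranked class present in the queue
```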

Following the introduction of single-metal deposition (SMD), a simplified fingermark detection technique based on multi-metal deposition, optimization studies were conducted. The different parameters of the original formula were tested, and the results were evaluated based on the contrast and overall aspect of the enhanced fingermarks. A new SMD formula was established from the optimized parameters. Interestingly, substantial deviations from the base parameters did not significantly affect the outcome of the enhancement, demonstrating that SMD is a very robust technique. Finally, a comparison of the optimized SMD with multi-metal deposition (MMD) was carried out on different surfaces. SMD was shown to produce results comparable to MMD, thus validating the technique.

Positron emission tomography with [18F]fluorodeoxyglucose (FDG-PET) plays a well-established role in assisting early detection of frontotemporal lobar degeneration (FTLD). Here, we examined the impact of intensity normalization to different reference areas on the accuracy of FDG-PET in discriminating between patients with mild FTLD and healthy elderly subjects. FDG-PET was conducted at two centers using different acquisition protocols: 41 FTLD patients and 42 controls were studied at center 1; 11 FTLD patients and 13 controls were studied at center 2. All PET images were intensity normalized to the cerebellum, primary sensorimotor cortex (SMC), cerebral global mean (CGM), and a reference cluster with the most preserved FDG uptake in the patient group of center 1. Metabolic deficits in the patient group at center 1 appeared 1.5, 3.6, and 4.6 times greater in spatial extent when tracer uptake was normalized to the reference cluster rather than to the cerebellum, SMC, and CGM, respectively. Logistic regression analyses based on normalized values from FTLD-typical regions showed that at center 1, cerebellar, SMC, CGM, and cluster normalizations differentiated patients from controls with accuracies of 86%, 76%, 75%, and 90%, respectively. A similar order of effects was found at center 2. Cluster normalization leads to a significant increase in statistical power for detecting early FTLD-associated metabolic deficits. The established FTLD-specific cluster can be used to improve detection of FTLD on a single-case basis at independent centers - a decisive step towards early diagnosis and prediction of FTLD syndromes enabling specific therapies in the future.
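Intensity normalization to a reference region amounts to dividing every voxel by the mean uptake inside that region. A minimal sketch with a synthetic volume and a hypothetical reference mask (not the study's actual cluster definition):

```python
import numpy as np

def normalize_to_reference(image, ref_mask):
    """Scale a PET image by the mean uptake in a reference-region mask."""
    ref_mean = image[ref_mask].mean()
    return image / ref_mean

rng = np.random.default_rng(1)
pet = rng.uniform(0.5, 2.0, size=(4, 4, 4))   # toy uptake volume
reference = np.zeros_like(pet, dtype=bool)
reference[0] = True                            # hypothetical reference voxels
norm = normalize_to_reference(pet, reference)
# by construction, mean normalized uptake in the reference region is 1,
# so patient/control differences elsewhere become directly comparable
```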

OBJECTIVE: To determine if the fixed-dose perindopril/indapamide combination (Per/Ind) normalizes blood pressure (BP) in the same fraction of hypertensive patients when treated in everyday practice or in controlled trials. METHODS: In this prospective trial, 17 938 hypertensive patients were treated with Per 2 mg/Ind 0.625 mg for 3-6 months. In Group 1, Per/Ind was initiated in newly diagnosed patients (n = 7032); in Group 2, Per/Ind replaced previous therapy in patients already treated but having either their BP still uncontrolled or experiencing side-effects (n = 7423); in Group 3, Per/Ind was added to previous treatment in patients with persistently high BP (n = 3483). BP was considered normalized when ≤ 140/90 mm Hg. A multivariate analysis for predictors of BP normalization was performed. RESULTS: Subjects were on average 62 years old and had a baseline BP of 162.3/93.6 mm Hg. After treatment with Per/Ind, BP normalization was reached in 69.6% of patients in the Initiation Group, 67.5% in the Replacement Group, and 67.4% in the Add-on Group (where patients were more frequently at risk, diabetic, or with target organ damage). Mean decreases of 22.8 mm Hg in systolic BP and 12.4 mm Hg in diastolic BP were recorded. CONCLUSIONS: This trial was established to reflect everyday clinical practice, and a treatment strategy based on the Per/Ind combination, administered as initial, replacement, or add-on therapy, led to normalization rates that were superior to those observed in Europe in routine practice. These results support recent hypertension guidelines, which encourage the use of combination therapy in the management of arterial hypertension.

A systematic method to improve the quality (Q) factor of RF integrated inductors is presented in this paper. The proposed method is based on layout optimization to minimize the series resistance of the inductor coil, taking into account both ohmic losses, due to conduction currents, and magnetically induced losses, due to eddy currents. The technique is particularly useful when applied to inductors in which the fabrication process includes integration substrate removal, but it is also applicable to inductors on low-loss substrates. The method optimizes the width of the metal strip for each turn of the inductor coil, leading to a variable strip-width layout. The optimization procedure has been successfully applied to the design of square spiral inductors in a silicon-based multichip-module technology, complemented with silicon micromachining postprocessing. The experimental results obtained corroborate the validity of the proposed method. A Q factor of about 17 has been obtained for a 35-nH inductor at 1.5 GHz, with Q values higher than 40 predicted for a 20-nH inductor working at 3.5 GHz. The latter is up to 60% better than the best results for a single strip-width inductor working at the same frequency.
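The per-turn width trade-off can be caricatured with a toy loss model, R(w) = a/w + b·w²: the ohmic term falls with strip width while the eddy-current term grows with it. The coefficients and fabrication limits below are illustrative, not fitted to the actual technology.

```python
def optimal_strip_width(a, b, w_min, w_max):
    """Minimize R(w) = a/w + b*w**2 in closed form (dR/dw = 0 gives
    w* = (a/(2b))**(1/3)), clipped to fabrication limits."""
    w_star = (a / (2.0 * b)) ** (1.0 / 3.0)
    return min(max(w_star, w_min), w_max)

# inner turns sit in a stronger magnetic field (larger eddy coefficient b),
# so they receive narrower strips than the outer turns: a variable-width layout
widths = [optimal_strip_width(1.0, b, 2.0, 30.0) for b in (0.001, 0.01, 0.1)]
```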

Long-term preservation of bioreporter bacteria is essential for the functioning of cell-based detection devices, particularly when field application, e.g., in developing countries, is intended. We varied the culture conditions (i.e., the NaCl content of the medium), the storage protection media, and the preservation methods (vacuum drying vs. encapsulation in gels that remain hydrated) in order to achieve optimal preservation of the activity of As(III) bioreporter bacteria during up to 12 weeks of storage at 4 °C. The presence of 2% sodium chloride during cultivation improved the response intensity of some bioreporters upon reconstitution, particularly of those that had been dried and stored in the presence of sucrose or trehalose and 10% gelatin. The most satisfying, stable response to arsenite after 12 weeks of storage was obtained with cells that had been dried in the presence of 34% trehalose and 1.5% polyvinylpyrrolidone. Amendments of peptone, meat extract, sodium ascorbate, and sodium glutamate preserved the bioreporter activity only for the first 2 weeks, but not during long-term storage. Only short-term stability was likewise achieved when bioreporter bacteria were encapsulated in gels that remained hydrated during storage.

Because of the increase in workplace automation and the diversification of industrial processes, workplaces have become more and more complex. The classical approaches used to address workplace hazard concerns, such as checklists or sequence models, are therefore of limited use in such complex systems. Moreover, because of the multifaceted nature of workplaces, the use of single-oriented methods, such as AEA (man-oriented), FMEA (system-oriented), or HAZOP (process-oriented), is not satisfactory. The use of a dynamic modeling approach to allow multiple-oriented analyses may constitute an alternative to overcome this limitation. The qualitative modeling aspects of the MORM (man-machine occupational risk modeling) model are discussed in this article. The model, implemented with an object-oriented Petri net tool (CO-OPN), has been developed to simulate and analyze industrial processes from an occupational health and safety (OH&S) perspective. The industrial process is modeled as a set of interconnected subnets (state spaces), which describe its constitutive machines. Process-related factors are introduced explicitly through machine interconnections and flow properties. While man-machine interactions are modeled as triggering events for the state spaces of the machines, the CREAM cognitive behavior model is used to establish the relevant triggering events. In the CO-OPN formalism, the model is expressed as a set of interconnected CO-OPN objects defined over data types expressing the measure attached to the flow of entities transiting through the machines. Constraints on the measures assigned to these entities are used to determine the state changes in each machine. Interconnecting machines implies the composition of such flows and consequently the interconnection of the measure constraints. This is reflected by the construction of constraint enrichment hierarchies, which can be used for simulation and analysis optimization in a clear mathematical framework.
The use of Petri nets to perform multiple-oriented analysis opens perspectives in the field of industrial risk management. It may significantly reduce the duration of the assessment process. But, most of all, it opens perspectives in the field of risk comparisons and integrated risk management. Moreover, because of the generic nature of the model and tool used, the same concepts and patterns may be used to model a wide range of systems and application fields.
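The machine subnets above can be sketched with a minimal place/transition net. This is a generic Petri net skeleton, not the CO-OPN formalism itself; the place and transition names are hypothetical.

```python
class PetriNet:
    """Minimal place/transition net: a transition fires when all of its
    input places hold enough tokens. Machine state changes triggered by
    man-machine interaction events are modeled the same way."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} not enabled")
        inputs, outputs = self.transitions[name]
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# a machine moving from 'idle' to 'processing' on an operator action
net = PetriNet({"idle": 1, "processing": 0})
net.add_transition("start", {"idle": 1}, {"processing": 1})
net.fire("start")
```

Interconnecting machines then amounts to sharing places between subnets, which is where the flow measures and their constraints compose.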

Previous Iowa DOT sponsored research has shown that some Class C fly ashes are cementitious (because calcium is combined as calcium aluminates) while other Class C ashes containing similar amounts of elemental calcium are not (1). Fly ashes from modern power plants in Iowa contain significant amounts of calcium in their glassy phases, regardless of their cementitious properties. The present research was based on these findings and on the hypothesis that attack of the amorphous phase of high-calcium fly ash could be initiated with trace additives, thus making calcium available for the formation of useful calcium-silicate cements. Phase I research was devoted to finding potential additives through a screening process; the likely chemicals were tested with fly ashes representative of the cementitious and non-cementitious ashes available in the state. Ammonium phosphate, a fertilizer, was found to produce 3,600 psi cement with cementitious Neal #4 fly ash; this strength is roughly equivalent to that of portland cement, but at about one-third the cost. Neal #2 fly ash, a slightly cementitious Class C ash, was found to respond best to ammonium nitrate; through the additive, a near-zero-strength material was transformed into a 1,200 psi cement. The second research phase was directed toward optimizing trace additive concentrations, defining the behavior of the resulting cements, evaluating more comprehensively the fly ashes available in Iowa, and explaining the cement formation mechanisms of the most promising trace additives. X-ray diffraction data demonstrate that both amorphous and crystalline hydrates of chemically enhanced fly ash differ from those of unaltered fly ash hydrates. Calcium-aluminum-silicate hydrates were formed, rather than the expected (and hypothesized) calcium-silicate hydrates. These new reaction products explain the observed strength enhancement.
The final phase concentrated on laboratory application of the chemically enhanced fly ash cements to road base stabilization. Emphasis was placed on the use of marginal aggregates, such as limestone crusher fines and unprocessed blow sand. The nature of the chemically modified fly ash cements led to an evaluation of fine-grained soil stabilization, where a wide range of materials, defined by plasticity index, could be stabilized. Parameters used for evaluation included strength, compaction requirements, set time, and frost resistance.

The function of DNA-binding proteins is controlled not just by their abundance, but mainly at the level of their activity, in terms of their interactions with DNA and protein targets. Moreover, the affinity of such transcription factors for their target sequences is often controlled by co-factors and/or modifications that are not easily assessed from biological samples. Here, we describe a scalable method for monitoring protein-DNA interactions on a microarray surface. This approach was designed to determine the DNA-binding activity of proteins in crude cell extracts, complementing conventional expression profiling arrays. Enzymatic labeling of the DNA enables direct normalization of protein binding to the microarray, allowing the estimation of relative binding affinities. Using DNA sequences covering a range of affinities, we show that the new microarray-based method yields binding strength estimates similar to low-throughput gel mobility-shift assays. The microarray is also highly sensitive, as it allows the detection of a rare DNA-binding protein from breast cancer cells, the human tumor suppressor AP-2. This approach thus mediates precise and robust assessment of the activity of DNA-binding proteins and takes present DNA-binding assays to a high-throughput level.
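The normalization step can be sketched as dividing each spot's protein-binding signal by its labeled-DNA signal, which corrects for unequal DNA amounts per spot. All intensities below are hypothetical, and scaling to the strongest spot is one possible convention, not necessarily the authors':

```python
import numpy as np

def relative_binding_affinity(protein_signal, dna_signal):
    """Normalize protein-binding fluorescence by the enzymatically labeled
    DNA signal on each spot, then scale so the strongest spot equals 1."""
    ratio = np.asarray(protein_signal) / np.asarray(dna_signal)
    return ratio / ratio.max()

# hypothetical spot intensities for four probe sequences
rel = relative_binding_affinity([200.0, 150.0, 90.0, 10.0],
                                [100.0, 100.0, 90.0, 100.0])
```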

Ski resorts are deploying more and more artificial snow-making systems. These tools are necessary to sustain an important economic activity in the high alpine valleys. However, artificial snow raises important environmental issues that can be reduced by optimizing its production. This paper presents a software prototype based on artificial intelligence to help ski resorts better manage their snowpack. It combines a general neural network for snow-cover analysis and spatial prediction with a multi-agent simulation of skiers for analyzing the spatial impact of ski practice. The prototype has been tested on the ski resort of Verbier (Switzerland).

One major methodological problem in analysis of sequence data is the determination of costs from which distances between sequences are derived. Although this problem is currently not optimally dealt with in the social sciences, it has some similarity with problems that have been solved in bioinformatics for three decades. In this article, the authors propose an optimization of substitution and deletion/insertion costs based on computational methods. The authors provide an empirical way of determining costs for cases, frequent in the social sciences, in which theory does not clearly promote one cost scheme over another. Using three distinct data sets, the authors tested the distances and cluster solutions produced by the new cost scheme in comparison with solutions based on cost schemes associated with other research strategies. The proposed method performs well compared with other cost-setting strategies, while it alleviates the justification problem of cost schemes.

Intensity-modulated radiotherapy (IMRT) treatment plan verification by comparison with measured data requires access to the linear accelerator and is time consuming. In this paper, we propose a method for monitor unit (MU) calculation and plan comparison for step-and-shoot IMRT based on the Monte Carlo code EGSnrc/BEAMnrc. The beamlets of an IMRT treatment plan are individually simulated using Monte Carlo and converted into absorbed dose to water per MU. The dose of the whole treatment can be expressed through a linear matrix equation of the MUs and the dose per MU of every beamlet. Owing to the positivity of the absorbed dose and MU values, this equation is solved for the MU values using a non-negative least-squares (NNLS) optimization algorithm. The Monte Carlo plan is formed by multiplying the Monte Carlo absorbed dose to water per MU by the Monte Carlo/NNLS MUs. Treatment plans for several localizations calculated with a commercial treatment planning system (TPS) are compared with the proposed method for validation. The Monte Carlo/NNLS MUs are close to those calculated by the TPS and lead to a treatment dose distribution that is clinically equivalent to the one calculated by the TPS. This procedure can be used for IMRT QA, and further development could allow this technique to be used for other radiotherapy techniques such as tomotherapy or volumetric modulated arc therapy.
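The core step, solving the linear system d = B·mu for non-negative MU values, can be sketched with a simple projected-gradient NNLS. This is a stand-in for a dedicated NNLS solver, not the paper's implementation, and the beamlet matrix below is synthetic.

```python
import numpy as np

def nnls_projected_gradient(B, d, iters=5000):
    """Solve min ||B @ mu - d||^2 subject to mu >= 0 by projected
    gradient descent. Rows of B are dose points, columns are beamlets
    (dose to water per MU); d is the target dose vector."""
    mu = np.zeros(B.shape[1])
    step = 1.0 / np.linalg.norm(B, 2) ** 2   # safe step from the spectral norm
    for _ in range(iters):
        mu = np.maximum(0.0, mu - step * (B.T @ (B @ mu - d)))
    return mu

# toy example: 3 beamlets evaluated at 4 dose points
B = np.array([[1.0, 0.2, 0.0],
              [0.5, 1.0, 0.1],
              [0.0, 0.4, 1.0],
              [0.1, 0.0, 0.6]])
mu_true = np.array([2.0, 1.0, 3.0])
d = B @ mu_true
mu = nnls_projected_gradient(B, d)   # recovers the non-negative MUs
```

The non-negativity projection is exactly what encodes the physical constraint that neither doses nor monitor units can be negative.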

In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is needed for structure-based lead optimization. We present here a new docking software, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and, in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the latter. This validation illustrates the efficiency of our sampling strategy, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and to 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide on the alphaVbeta3 integrin, starting far away from the binding pocket.
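The 2 Å success criterion is a plain coordinate RMSD between the docked pose and the crystal pose. A minimal sketch, assuming identical atom ordering and no superposition (the two-atom "ligand" below is synthetic):

```python
import numpy as np

def rmsd(coords_a, coords_b):
    """Root mean square deviation between two poses of the same ligand
    (same atom ordering assumed, no rigid-body superposition applied)."""
    diff = np.asarray(coords_a) - np.asarray(coords_b)
    return np.sqrt((diff ** 2).sum(axis=1).mean())

pose = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
shifted = pose + np.array([1.0, 0.0, 0.0])   # rigid 1 Å translation
# a binding mode counts as correct when rmsd(pose, crystal) < 2 Å
```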

Pharmacokinetic variability in drug levels represents, for some drugs, a major determinant of treatment success, since sub- or supra-therapeutic concentrations might lead to inefficacy, toxic reactions, or treatment discontinuation. This is true for most antiretroviral drugs, which exhibit high inter-patient variability in their pharmacokinetics that has been only partially explained by genetic and non-genetic factors. The population pharmacokinetic approach represents a very useful tool for describing the dose-concentration relationship, quantifying variability in the target population of patients, and identifying influencing factors. It can thus be used to make predictions and to optimize dosage adjustment based on Bayesian therapeutic drug monitoring (TDM). This approach has been used to characterize the pharmacokinetics of nevirapine (NVP) in 137 HIV-positive patients followed within the frame of a TDM program. Among the tested covariates, body weight, co-administration of a cytochrome (CYP) 3A4 inducer or boosted atazanavir, as well as elevated aspartate transaminases, showed an effect on NVP elimination. In addition, a genetic polymorphism in CYP2B6 was associated with reduced NVP clearance. Altogether, these factors could explain 26% of the variability in NVP. Model-based simulations were used to compare the adequacy of different dosage regimens in relation to the therapeutic target associated with treatment efficacy. In conclusion, the population approach is very useful for characterizing the pharmacokinetic profile of drugs in a population of interest. The quantification and identification of the sources of variability is a rational approach to making optimal dosage decisions for certain drugs administered chronically.
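The structure of such a model can be illustrated with a one-compartment oral-absorption model whose clearance depends on covariates. All parameter values below are illustrative placeholders, not the published NVP population estimates.

```python
import math

def concentration(dose, t, cl, v, ka):
    """One-compartment model with first-order absorption:
    C(t) = dose*ka / (v*(ka - ke)) * (exp(-ke*t) - exp(-ka*t)),
    with elimination rate ke = cl / v."""
    ke = cl / v
    return dose * ka / (v * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

def clearance(cl_pop, weight, inducer=False, theta_wt=0.01, theta_ind=1.3):
    """Typical clearance adjusted for covariates; theta_wt and theta_ind
    are hypothetical covariate coefficients, not fitted values."""
    cl = cl_pop * (1.0 + theta_wt * (weight - 70.0))
    return cl * theta_ind if inducer else cl

# co-administration of a CYP3A4 inducer raises clearance, lowering C(t)
c_base = concentration(200.0, 12.0, clearance(3.0, 70.0), 100.0, 1.0)
c_ind = concentration(200.0, 12.0, clearance(3.0, 70.0, inducer=True), 100.0, 1.0)
```

In a Bayesian TDM setting, the same model is combined with a patient's measured concentrations to individualize the parameters before simulating candidate dosage regimens.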