911 results for Multivariate optimization problem
Abstract:
Intelligence (IQ) can be seen as an index of the efficiency of mental processes or cognition, as can performance on basic information processing (IP) tasks like those used in our ongoing Memory, Attention and Problem Solving (MAPS) study. Measures of IQ and IP are correlated and both have a genetic component, so we are studying how the genetic variance in IQ is related to the genetic variance in IP. We measured intelligence with five subscales of the Multidimensional Aptitude Battery (MAB). The IP tasks included four variants of choice reaction time (CRT) and a visual inspection time (IT) task. The influence of genetic factors on the variances in each of the IQ, IP, and IT tasks was investigated in 250 identical and nonidentical twin pairs aged 16 years. For a subset of 50 pairs we have test–retest data that allow us to estimate the stability of the measures. MX was used for a multivariate genetic analysis that addresses whether the variance in the IQ and IP measures is mediated by common genetic factors. Analyses that show the modeled genetic and environmental influences on these measures of cognitive efficiency will be presented and their relevance to ideas on intelligence will be discussed.
Abstract:
As part of a large ongoing project, the Memory, Attention and Problem Solving (MAPS) study, we investigated whether genetic variability explains some of the variance in psychophysiological correlates of brain function, namely, the P3 and SW components of event-related potentials (ERPs). These ERP measures are fine-grained time recordings of brain processes and, because they reflect fundamental cognitive processing, provide a unique window on the millisecond-to-millisecond transactions taking place in the human brain at the cognitive level. The extent to which the variance in P3 and SW components is influenced by genetic factors was examined in 350 identical and nonidentical twin pairs aged 16 years. ERPs were recorded from 15 scalp electrodes during the performance of a visuospatial delayed response task that engages working memory. Multivariate genetic analyses using MX were used to estimate genetic and environmental influences on individual differences in brain functioning and to identify putative genetic factors common to the ERP measures and psychometric IQ. For each of the ERP measures, correlation among electrode sites was high, a spatial pattern was evident, and a large part of the genetic variation in the ERPs appeared to be mediated by a common genetic factor. Moderate within-pair concordance in MZ pairs was found for all ERP measures, with higher correlations found for P3 than SW, and the MZ twin pair correlations were approximately twice the DZ correlations, suggesting a genetic influence. Correlations between ERP measures and psychometric IQ were found and, although moderately low, were evident across electrode sites. The analyses show that the ERP components, P3 and SW, are promising phenotypes of the neuroelectrical activity of the brain and have the potential to be used in linkage and association analysis in the search for QTLs influencing cognitive function.
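As a back-of-the-envelope illustration of the twin logic mentioned above (MZ correlations roughly twice the DZ correlations suggesting a genetic influence), the sketch below applies the classical Falconer decomposition to hypothetical twin correlations. The correlation values are placeholders, not MAPS results, and the study itself used multivariate structural-equation modelling in MX rather than this shortcut.

```python
# Illustrative Falconer estimates from twin correlations (hypothetical values,
# not results from the MAPS study; the actual analysis used MX SEM models).

def falconer_estimates(r_mz: float, r_dz: float) -> dict:
    """Classical Falconer decomposition of variance from twin correlations.

    A (additive genetic)     = 2 * (rMZ - rDZ)
    C (shared environment)   = 2 * rDZ - rMZ
    E (unique environment)   = 1 - rMZ
    """
    return {"A": 2 * (r_mz - r_dz),
            "C": 2 * r_dz - r_mz,
            "E": 1 - r_mz}

# Example: an MZ correlation about twice the DZ correlation, as reported for the ERPs.
print(falconer_estimates(r_mz=0.50, r_dz=0.25))  # {'A': 0.5, 'C': 0.0, 'E': 0.5}
```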
Abstract:
Purpose. To conduct a controlled trial of traditional and problem-based learning (PBL) methods of teaching epidemiology. Method. All second-year medical students (n = 136) at The University of Western Australia Medical School were offered the chance to participate in a randomized controlled trial of teaching methods for an epidemiology course. Students who consented to participate (n = 80) were randomly assigned to either a PBL or a traditional course. Students who did not consent or did not return the consent form (n = 56) were assigned to the traditional course. Students in both streams took identical quizzes and exams. These scores, a collection of semi-quantitative feedback from all students, and a qualitative analysis of interviews with a convenience sample of six students from each stream were compared. Results. There was no significant difference in performance on quizzes or exams between PBL and traditional students. Students using PBL reported a stronger grasp of epidemiologic principles, enjoyed working with a group, and, at the end of the course, were more enthusiastic about epidemiology and its professional relevance to them than were students in the traditional course. PBL students worked more steadily during the semester but spent only marginally more time on the epidemiology course overall. Interviews corroborated these findings. Non-consenting students were older (p < 0.02) and more likely to come from non-English-speaking backgrounds (p < 0.005). Conclusions. PBL provides an academically equivalent but personally far richer learning experience. The adoption of PBL approaches to medical education makes it important to study whether PBL presents particular challenges for students whose first language is not the language of instruction.
Abstract:
Surge flow phenomena, e.g., as a consequence of a dam failure or a flash flood, represent free boundary problems. The extending computational domain together with the discontinuities involved renders their numerical solution a cumbersome procedure. This contribution proposes an analytical solution to the problem. It is based on the slightly modified zero-inertia (ZI) differential equations for nonprismatic channels and uses exclusively physical parameters. Employing the concept of a momentum-representative cross section of the moving water body together with a specific relationship for describing the cross-sectional geometry leads, after considerable mathematical calculus, to the analytical solution. The hydrodynamic analytical model is free of numerical troubles, easy to run, computationally efficient, and fully satisfies the law of volume conservation. In a first test series, the hydrodynamic analytical ZI model compares very favorably with a full hydrodynamic numerical model with respect to published results of surge flow simulations in different types of prismatic channels. In order to extend these considerations to natural rivers, the accuracy of the analytical model in describing an irregular cross section is investigated and tested successfully. A sensitivity and error analysis reveals the important impact of the hydraulic radius on the velocity of the surge, and this underlines the importance of an adequate description of the topography. The new approach is finally applied to simulate a surge propagating down the irregularly shaped Isar Valley in the Bavarian Alps after a hypothetical dam failure. The straightforward and fully stable computation of the flood hydrograph along the Isar Valley clearly reflects the impact of the strongly varying topographic characteristics on the flow phenomenon. Apart from treating surge flow phenomena as a whole, the analytical solution also offers a rigorous alternative to both (a) the approximate Whitham solution, for generating initial values, and (b) the rough volume balance techniques used to model the wave tip in numerical surge flow computations.
Abstract:
Injection drug use (involving the injection of illicit opiates) poses serious public health problems in many countries. Research has indicated that injection drug users are at higher risk for morbidity in the form of HIV/AIDS and Hepatitis B and C, and drug-related mortality, as well as increased criminal activity. Methadone maintenance treatment is the most prominent form of pharmacotherapy treatment for illicit opiate dependence in several countries, and its application varies internationally with respect to treatment regulations and delivery modes. In order to effectively treat those patients who have previously been resistant to methadone maintenance treatment, several countries have been studying and/or considering heroin-assisted treatment as a complementary form of opiate pharmacotherapy treatment. This paper provides an overview of the prevalence of injection drug use and the opiate dependence problem internationally, the current opiate dependence treatment landscape in several countries, and the status of ongoing or planned heroin-assisted treatment trials in Australia, Canada and certain European countries.
Abstract:
In this paper, a genetic algorithm (GA) is applied to the optimum design of reinforced concrete liquid-retaining structures, which involves three discrete design variables: slab thickness, reinforcement diameter and reinforcement spacing. The GA, being a search technique based on the mechanics of natural genetics, couples a Darwinian survival-of-the-fittest principle with a random yet structured information exchange amongst a population of artificial chromosomes. As a first step, a penalty-based strategy is employed to transform the constrained design problem into an unconstrained problem, which is appropriate for GA application. A numerical example is then used to demonstrate the strength and capability of the GA in this problem domain. It is shown that near-optimal solutions are obtained at an extremely fast convergence rate after exploring only a minute portion of the search space. The method can be extended to even more complex optimization problems in other domains.
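As a rough illustration of the penalty-based strategy described in the abstract, the following sketch runs a simple GA over three discrete design variables with constraint violations folded into the objective as a penalty. The candidate values, cost function, and capacity constraint are hypothetical placeholders, not the paper's actual design formulation.

```python
# Minimal sketch of a penalty-based GA over discrete design variables
# (hypothetical ranges and cost/constraint functions, not the paper's formulation).
import random

THICKNESS = [200, 250, 300, 350, 400]      # slab thickness candidates (mm)
BAR_DIAM  = [10, 12, 16, 20, 25]           # reinforcement diameter candidates (mm)
SPACING   = [100, 125, 150, 200, 250]      # reinforcement spacing candidates (mm)

def cost(t, d, s):
    # Placeholder material-cost proxy: thicker slabs and denser reinforcement cost more.
    return 0.05 * t + 2.0 * d + 500.0 / s

def constraint_violation(t, d, s):
    # Placeholder "capacity" constraint: demand must not exceed a crude capacity measure.
    capacity = t * d / s
    demand = 30.0
    return max(0.0, demand - capacity)

def penalized_fitness(chrom, penalty=1e3):
    # Penalty-based transformation: constrained problem -> unconstrained objective.
    t, d, s = THICKNESS[chrom[0]], BAR_DIAM[chrom[1]], SPACING[chrom[2]]
    return cost(t, d, s) + penalty * constraint_violation(t, d, s)

def evolve(pop_size=40, generations=100, p_mut=0.1):
    ranges = [len(THICKNESS), len(BAR_DIAM), len(SPACING)]
    pop = [[random.randrange(n) for n in ranges] for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: the fitter of two random individuals survives.
        parents = [min(random.sample(pop, 2), key=penalized_fitness) for _ in range(pop_size)]
        children = []
        for a, b in zip(parents[::2], parents[1::2]):
            cut = random.randrange(1, len(ranges))          # one-point crossover
            for child in (a[:cut] + b[cut:], b[:cut] + a[cut:]):
                child = [random.randrange(n) if random.random() < p_mut else g
                         for g, n in zip(child, ranges)]    # gene-wise mutation
                children.append(child)
        pop = children
    return min(pop, key=penalized_fitness)

best = evolve()
print("best design:", THICKNESS[best[0]], BAR_DIAM[best[1]], SPACING[best[2]])
```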
Abstract:
As in the standard land assembly problem, a developer wants to buy two adjacent blocks of land belonging to two different owners. The value of the two blocks of land to the developer is greater than the sum of the individual values of the blocks for each owner. Unlike the land assembly literature, however, our focus is on the incentive that each lot owner has to delay the start of negotiations, rather than on the public goods nature of the problem. An incentive for delay exists, for example, when owners perceive that being last to sell will allow them to capture a larger share of the joint surplus from the development. We show that competition at point of sale can cause equilibrium delay, and that cooperation at point of sale will eliminate delay. This suggests that strategic delay is another source for the inefficient allocation of land, in addition to the public-good type externality pointed out by Grossman and Hart [Bell Journal of Economics 11 (1980) 42] and O'Flaherty [Regional Science and Urban Economics 24 (1994) 287]. (C) 2004 Elsevier B.V. All rights reserved.
Abstract:
This paper is part of a large study to assess the adequacy of the use of multivariate statistical techniques in theses and dissertations of some higher education institutions in the area of marketing, with the theme of consumer behavior, from 1997 to 2006. Regression and conjoint analysis, two techniques with great potential for use in marketing studies, are the focus of this paper. The objective of this study was to analyze whether the employment of these techniques suits the needs of the research problem presented, as well as to evaluate the level of success in meeting their premises. Overall, the results suggest the need for more involvement of researchers in the verification of all the theoretical precepts of application of the techniques classified in the category of investigation of dependence among variables.
Abstract:
Image reconstruction using the EIT (Electrical Impedance Tomography) technique is a nonlinear and ill-posed inverse problem which demands a powerful direct or iterative method. A typical approach for solving the problem is to minimize an error functional using an iterative method. In this case, an initial solution close enough to the global minimum is mandatory to ensure convergence to the correct minimum in an appropriate time interval. The aim of this paper is to present a new, simple and low-cost technique (quadrant-searching) to reduce the search space and consequently to obtain an initial solution of the inverse problem of EIT. This technique calculates the error functional for four different contrast distributions, placing a large prospective inclusion in each of the four quadrants of the domain. Comparing the four values of the error functional, it is possible to draw conclusions about the internal electric contrast. For this purpose, we initially performed tests to assess the accuracy of the BEM (Boundary Element Method) when applied to the direct problem of EIT and to verify the behavior of the error functional surface in the search space. Finally, numerical tests were performed to verify the new technique.
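To make the quadrant-searching idea concrete, here is a minimal sketch, assuming a square domain and a stubbed linear forward model in place of the BEM solver used in the paper: the error functional is evaluated for four trial contrast distributions, each placing one large inclusion in a different quadrant, and the quadrant with the smallest error supplies the initial guess. All names and values are illustrative placeholders.

```python
# Sketch of the quadrant-searching initialization for EIT (hypothetical forward
# model and data; the paper uses a BEM forward model and measured boundary voltages).
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 32 * 32))   # fixed placeholder 'sensitivity' matrix standing in for BEM

def forward_model(contrast_map):
    """Stub for the EIT direct problem: map a contrast distribution to simulated
    boundary measurements via a fixed linear sensitivity matrix (placeholder for BEM)."""
    return W @ contrast_map.ravel()

def error_functional(contrast_map, measured):
    """Least-squares mismatch between simulated and measured boundary data."""
    return float(np.sum((forward_model(contrast_map) - measured) ** 2))

def quadrant_search(measured, n=32, background=1.0, inclusion=2.0):
    """Evaluate the error functional with one large inclusion per quadrant and
    return the name and trial map of the quadrant with the smallest error."""
    half = n // 2
    quadrants = {"NW": (slice(0, half), slice(0, half)),
                 "NE": (slice(0, half), slice(half, n)),
                 "SW": (slice(half, n), slice(0, half)),
                 "SE": (slice(half, n), slice(half, n))}
    best_name, best_map, best_err = None, None, np.inf
    for name, region in quadrants.items():
        trial = np.full((n, n), background)
        trial[region] = inclusion               # large prospective inclusion in one quadrant
        err = error_functional(trial, measured)
        if err < best_err:
            best_name, best_map, best_err = name, trial, err
    return best_name, best_map

# Hypothetical 'measured' data generated from a target with an inclusion in the NE quadrant.
target = np.full((32, 32), 1.0)
target[0:16, 16:32] = 2.0
print(quadrant_search(forward_model(target))[0])   # expected: "NE"
```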
Abstract:
In studies assessing trends in coronary events, such as the World Health Organization (WHO) MONICA Project (multinational MONItoring of trends and determinants of CArdiovascular disease), the main emphasis has been on coronary deaths and non-fatal definite myocardial infarctions (MI). It is, however, possible that the proportion of milder MIs may be increasing because of improvements in treatment and reductions in levels of risk factors. We used the MI register data of the WHO MONICA Project to investigate several definitions for mild non-fatal MIs that would be applicable in various settings and could be used to assess trends in milder coronary events. Of the 38 populations participating in the WHO MONICA MI register study, more than half registered a sufficiently wide spectrum of events that it was possible to identify subsets of milder cases. The event rates and case fatality rates of MI are clearly dependent on the spectrum of non-fatal MIs that are included. On clinical grounds we propose that the original MONICA category "non-fatal possible MI" could be divided into two groups: "non-fatal probable MI" and "prolonged chest pain." Non-fatal probable MIs are cases which, in addition to "typical symptoms," have electrocardiogram (ECG) or enzyme changes suggesting cardiac ischemia, but not severe enough to fulfil the criteria for non-fatal definite MI. In more than half of the MONICA Collaborating Centers, the registration of MI covers these milder events reasonably well. Proportions of non-fatal probable MIs vary less between populations than do proportions of non-fatal possible MIs. Also, rates of non-fatal probable MI are somewhat more highly correlated with rates of fatal events and non-fatal definite MI. These findings support the validity of the category of non-fatal probable MI. In each center, the increase in event rates and the decrease in case fatality due to the inclusion of non-fatal probable MI were larger for women than for men. For the WHO MONICA Project and other epidemiological studies, the proposed category of non-fatal probable MIs can be used for assessing trends in rates of milder MI. Copyright (C) 1997 Elsevier Science Inc.
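The effect described above (widening the non-fatal category raises event rates and lowers case fatality) is simple arithmetic; the sketch below uses hypothetical counts, not MONICA data, to show it.

```python
# Illustrative arithmetic only (hypothetical counts, not MONICA data): how widening
# the non-fatal MI definition raises the event rate and lowers case fatality.
fatal = 120                      # fatal coronary events in a registration period
nonfatal_definite = 300          # non-fatal definite MIs
nonfatal_probable = 150          # proposed milder category: non-fatal probable MIs
population = 100_000             # population under surveillance

def rates(nonfatal_included):
    events = fatal + nonfatal_included
    event_rate = events / population * 100_000     # events per 100,000 population
    case_fatality = fatal / events * 100           # percent of registered events that were fatal
    return event_rate, case_fatality

print(rates(nonfatal_definite))                        # definite MIs only: (420.0, ~28.6%)
print(rates(nonfatal_definite + nonfatal_probable))    # definite + probable: (570.0, ~21.1%)
```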
Abstract:
A G-design of order n is a pair (P, B) where P is the vertex set of the complete graph K_n and B is an edge-disjoint decomposition of K_n into copies of the simple graph G. Following design terminology, we call these copies "blocks". Here K_4 − e denotes the complete graph K_4 with one edge removed. It is well known that a K_4 − e design of order n exists if and only if n ≡ 0 or 1 (mod 5), n ≥ 6. The intersection problem here asks for which k it is possible to find two K_4 − e designs (P, B_1) and (P, B_2) of order n with |B_1 ∩ B_2| = k, that is, with precisely k common blocks. Here we completely solve this intersection problem for K_4 − e designs.
Abstract:
The classification rules of linear discriminant analysis are defined by the true mean vectors and the common covariance matrix of the populations from which the data come. Because these true parameters are generally unknown, they are commonly estimated by the sample mean vector and covariance matrix of the data in a training sample randomly drawn from each population. However, these sample statistics are notoriously susceptible to contamination by outliers, a problem compounded by the fact that the outliers may be invisible to conventional diagnostics. High-breakdown estimation is a procedure designed to remove this cause for concern by producing estimates that are immune to serious distortion by a minority of outliers, regardless of their severity. In this article we motivate and develop a high-breakdown criterion for linear discriminant analysis and give an algorithm for its implementation. The procedure is intended to supplement rather than replace the usual sample-moment methodology of discriminant analysis either by providing indications that the dataset is not seriously affected by outliers (supporting the usual analysis) or by identifying apparently aberrant points and giving resistant estimators that are not affected by them.
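The abstract develops its own high-breakdown criterion for linear discriminant analysis; as an illustrative stand-in only, the sketch below plugs minimum covariance determinant (MCD) location and scatter estimates into the usual two-class linear discriminant rule, so that a few gross outliers in the training sample do not distort the fitted rule. The data, class labels, and pooling scheme are hypothetical.

```python
# Sketch: robust linear discriminant rule using MCD location/scatter estimates
# in place of the sample mean and covariance. A stand-in illustration; the article
# develops its own high-breakdown criterion and algorithm.
import numpy as np
from sklearn.covariance import MinCovDet

def robust_lda_fit(X1, X2):
    """Fit a two-class linear discriminant rule with high-breakdown estimates."""
    mcd1, mcd2 = MinCovDet().fit(X1), MinCovDet().fit(X2)
    m1, m2 = mcd1.location_, mcd2.location_
    # Pool the robust scatter matrices, weighted by sample sizes.
    n1, n2 = len(X1), len(X2)
    S = ((n1 - 1) * mcd1.covariance_ + (n2 - 1) * mcd2.covariance_) / (n1 + n2 - 2)
    w = np.linalg.inv(S) @ (m1 - m2)              # discriminant direction
    c = 0.5 * w @ (m1 + m2)                       # midpoint threshold (equal priors)
    return lambda x: 1 if x @ w > c else 2        # classify a new observation

# Hypothetical training data with a few gross outliers contaminating class 1.
rng = np.random.default_rng(0)
X1 = np.vstack([rng.normal([0, 0], 1.0, size=(95, 2)),
                rng.normal([20, 20], 1.0, size=(5, 2))])   # aberrant points
X2 = rng.normal([3, 3], 1.0, size=(100, 2))
classify = robust_lda_fit(X1, X2)
print(classify(np.array([0.2, -0.1])), classify(np.array([3.1, 2.8])))  # expected: 1 2
```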