975 results for Selection Algorithms
Abstract:
In our previous project we approximated the computation of definite integrals whose integrands exhibit large functional variation. Our approach parallelizes the computation algorithm of an adaptive quadrature method based on Newton-Cotes rules. The first results obtained were presented at several national and international conferences; they allowed us to begin a typification of the existing quadrature rules and a classification of some functions used as test functions. These classification and typification tasks have not yet been completed, so we intend to continue them in order to report on whether or not our technique is advisable to use. To carry out this task, a base of test functions will be assembled and the spectrum of quadrature rules used will be broadened. In addition, we propose to restructure the computation of some routines involved in calculating the minimum energy of a molecule. This program already exists in a sequential version and is modeled using the LCAO approximation. It achieves good precision compared with similar international publications, but requires a significantly long computation time. Our proposal is to parallelize the aforementioned algorithm at no fewer than two levels: 1) decide whether it is better to distribute the computation of a single integral among several processors or to distribute different integrals among different processors; we must keep in mind that in parallel architectures based on networks (typically local area networks, LANs), the time spent passing messages between processors is very significant when measured in the number of arithmetic operations a processor could complete in that time; 2) if necessary, parallelize the computation of double and/or triple integrals. To develop this proposal, heuristics will be devised to verify and build models for the cases mentioned, with the aim of improving the known computation routines, and the algorithms will be tested against test cases. The methodology is the usual one in numerical computing. Each proposal requires: a) implementing a computation algorithm, seeking versions that improve on the existing ones; b) carrying out comparison exercises against the existing routines to confirm or rule out better numerical performance; c) performing theoretical error studies related to the method and to the implementation. An interdisciplinary team was formed, comprising researchers from both Computer Science and Mathematics. Goals: We expect to obtain a characterization of quadrature rules according to their effectiveness on functions with oscillatory behavior and exponential decay, and to develop suitable, optimized computational implementations based on parallel architectures.
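As a rough illustration of the kind of routine being parallelized, the following is a minimal sketch of serial adaptive quadrature built on Simpson's rule, a closed Newton-Cotes rule. The function names, tolerance handling and sample integrand are assumptions for illustration, not the project's actual code.

```python
# Sketch: adaptive quadrature with Simpson's rule (a closed
# Newton-Cotes rule). Illustrative only, not the project's routine.

def simpson(f, a, b):
    """Simpson's rule on [a, b]."""
    c = (a + b) / 2.0
    return (b - a) / 6.0 * (f(a) + 4.0 * f(c) + f(b))

def adaptive_simpson(f, a, b, tol=1e-8):
    """Recursively refine intervals where the integrand varies strongly."""
    c = (a + b) / 2.0
    whole = simpson(f, a, b)
    left, right = simpson(f, a, c), simpson(f, c, b)
    # Richardson-style error estimate for Simpson's rule
    if abs(left + right - whole) < 15.0 * tol:
        return left + right + (left + right - whole) / 15.0
    # In a parallel version, these two recursive calls are the natural
    # unit of work to distribute among processors.
    return (adaptive_simpson(f, a, c, tol / 2.0) +
            adaptive_simpson(f, c, b, tol / 2.0))

if __name__ == "__main__":
    import math
    # An oscillatory test integrand of the kind mentioned in the abstract
    print(adaptive_simpson(lambda x: math.sin(50.0 * x), 0.0, 1.0))
```

Whether the recursive subintervals or whole integrals are the better unit of distribution is precisely the first-level question the project raises, given the message-passing cost on LAN-based architectures.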
Abstract:
As digital image processing techniques become increasingly used in a broad range of consumer applications, developers have come to recognise the evaluation of algorithm performance as an area of vital importance. With digital image processing algorithms now playing a greater role in security and protection applications, it is crucial that we are able to study their performance empirically. Apart from the field of biometrics, little emphasis has been placed on algorithm performance evaluation until now, and where evaluation has taken place, it has been carried out in a cumbersome and unsystematic fashion, without any standardised approach. This paper presents a comprehensive testing methodology and framework aimed at automating the evaluation of image processing algorithms. Ultimately, the test framework aims to shorten the algorithm development life cycle by helping to identify algorithm performance problems quickly and efficiently.
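For concreteness, a minimal sketch of the kind of automated evaluation loop such a framework would run is shown below; the PSNR metric and all function names are assumptions for illustration, not the paper's framework.

```python
# Sketch: automated evaluation of an image processing algorithm
# against a ground-truth test set. Metric and names are assumed.
import numpy as np

def psnr(reference, output, peak=255.0):
    """Peak signal-to-noise ratio between two images."""
    mse = np.mean((reference.astype(float) - output.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

def evaluate(algorithm, test_cases):
    """Run `algorithm` over (input, ground_truth) pairs and score each."""
    return [psnr(truth, algorithm(image)) for image, truth in test_cases]
```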
Abstract:
This research studies the phenomena of national and corporate culture. National culture is the culture the members of a country share, and corporate culture is a subculture which members of an organisation share (Schein, 1992). The objective of this research is to reveal whether the employees within equivalent Irish and American companies share the same corporate and national culture, and to ascertain whether, within each company, there is a link between national culture and corporate culture. This objective is achieved by replicating research conducted by Shing (1997) in Taiwan. Hypotheses and analytical tools developed by Shing are employed in the current study to allow comparison of results between Shing's study and the current one. The methodology used called for the measurement and comparison of national and corporate culture in two equivalent companies within the same industry. The two companies involved in this study are both located in Ireland and are of American and Irish origin respectively. A sample of three hundred was selected, and the response rate was 54%. The findings from this research are: (1) the two companies involved had different corporate cultures; (2) they had the same national culture; (3) there was no link between national culture and corporate culture within either company; (4) the findings were not similar to those of Shing (1997). The implication of these findings is that national and corporate culture are separate phenomena; therefore, corporate culture is not a response to national culture. The results of this research are not reflected in the findings of Shing (1997) and are therefore context specific. The core recommendation for management is that corporate culture should take account of national culture, because although employees recognise the espoused values of corporate culture (Schein, 1992), they are at the same time influenced by a much stronger force: their national culture.
Abstract:
Magdeburg, University, Faculty of Natural Sciences, doctoral dissertation, 2012
Abstract:
Magdeburg, University, Faculty of Mathematics, habilitation thesis, 2006
Abstract:
Background: Several researchers seek methods for selecting homogeneous groups of animals in experimental studies, since homogeneity is an indispensable prerequisite for the randomization of treatments. The lack of robust methods that comply with statistical and biological principles is the reason why researchers resort to empirical or subjective methods, which influences their results. Objective: To develop a multivariate statistical model for the selection of a homogeneous group of animals for experimental research, and to build a computational package implementing it. Methods: The set of echocardiographic data of 115 male Wistar rats with supravalvular aortic stenosis (AoS) was used as an example for model development. Initially, the data were standardized and thus became dimensionless. Then, the variance matrix of the set was submitted to principal components analysis (PCA), aiming to reduce the parametric space while retaining the relevant variability. That technique established a new Cartesian system into which the animals were allocated, and finally a confidence region (ellipsoid) was built for the profile of the animals' homogeneous responses. Animals located inside the ellipsoid were considered as belonging to the homogeneous batch; those outside it were considered spurious. Results: The PCA established eight descriptive axes that together represented 88.71% of the accumulated variance of the data set. The allocation of the animals in the new system and the construction of the confidence region revealed six spurious animals, leaving a homogeneous batch of 109 animals. Conclusion: The biometric criterion presented proved effective, because it considers the animal as a whole, analyzing all measured parameters jointly, in addition to having a small discard rate.
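A minimal sketch of the pipeline described above (standardization, PCA on the covariance matrix, and a confidence ellipsoid on the retained components) might look as follows; the 95% confidence level, the variance target and all names are illustrative assumptions, not the paper's computational package.

```python
# Sketch: select a homogeneous batch via PCA + confidence ellipsoid.
# Illustrative assumptions: 95% confidence level, ~88% variance target.
import numpy as np
from scipy import stats

def homogeneous_batch(X, var_target=0.88, alpha=0.95):
    """X: (n_animals, n_params) raw measurements.
    Returns a boolean mask of animals inside the confidence ellipsoid."""
    # 1) Standardize so all parameters become dimensionless
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    # 2) PCA via eigendecomposition of the covariance matrix
    eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(eigval)[::-1]
    eigval, eigvec = eigval[order], eigvec[:, order]
    # Keep enough components to retain the target variance
    k = np.searchsorted(np.cumsum(eigval) / eigval.sum(), var_target) + 1
    scores = Z @ eigvec[:, :k]
    # 3) Squared Mahalanobis distance in PC space is compared against
    # a chi-square quantile with k degrees of freedom (the ellipsoid)
    d2 = np.sum(scores**2 / eigval[:k], axis=1)
    return d2 <= stats.chi2.ppf(alpha, df=k)

# Usage: mask = homogeneous_batch(echo_data); spurious = ~mask
```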
Abstract:
ERP, auditory virtual reality, dichotic listening, selective auditory attention, cocktail-party phenomenon, HRTF
Abstract:
Background: Guidelines recommend that in suspected stable coronary artery disease (CAD), a clinical (non-invasive) evaluation should be performed before coronary angiography. Objective: We assessed the efficacy of patient selection for coronary angiography in suspected stable CAD. Methods: We prospectively selected consecutive patients without known CAD referred to a high-volume tertiary center. Demographic characteristics, risk factors, symptoms and non-invasive test results were correlated with the presence of obstructive CAD. We estimated the CAD probability based on available clinical data and the incremental diagnostic value of previous non-invasive tests. Results: A total of 830 patients were included; median age was 61 years, 49.3% were male, 81% had hypertension and 35.5% were diabetic. Non-invasive tests were performed in 64.8% of the patients. At coronary angiography, 23.8% of the patients had obstructive CAD. The independent predictors of obstructive CAD were: male gender (odds ratio [OR] 3.95; 95% confidence interval [CI] 2.70-5.77), age (OR per 5-year increment 1.15; 95% CI 1.06-1.26), diabetes (OR 2.01; 95% CI 1.40-2.90), dyslipidemia (OR 2.02; 95% CI 1.32-3.07), typical angina (OR 2.92; 95% CI 1.77-4.83) and a previous non-invasive test (OR 1.54; 95% CI 1.05-2.27). Conclusions: In this study, fewer than a quarter of the patients referred for coronary angiography with suspected CAD had the diagnosis confirmed. Better clinical and non-invasive assessment is necessary to improve the efficacy of patient selection for coronary angiography.
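Because log-odds are additive in a logistic regression, the reported odds ratios combine multiplicatively; the sketch below illustrates this with the published ORs. The model intercept (baseline odds) is not reported in the abstract, so only a relative odds multiplier can be computed here, and all variable names are our own.

```python
# Sketch: combining the published multivariate odds ratios.
# Relative odds only; the intercept is not published.
ODDS_RATIOS = {
    "male": 3.95,
    "age_per_5y": 1.15,   # per 5-year increment
    "diabetes": 2.01,
    "dyslipidemia": 2.02,
    "typical_angina": 2.92,
    "prior_noninvasive_test": 1.54,
}

def relative_odds(male, age_years, diabetes, dyslipidemia,
                  typical_angina, prior_test, reference_age=61):
    """Odds of obstructive CAD relative to a reference patient.
    Log-odds are additive in logistic regression, so ORs multiply."""
    odds = 1.0
    odds *= ODDS_RATIOS["male"] if male else 1.0
    odds *= ODDS_RATIOS["age_per_5y"] ** ((age_years - reference_age) / 5.0)
    odds *= ODDS_RATIOS["diabetes"] if diabetes else 1.0
    odds *= ODDS_RATIOS["dyslipidemia"] if dyslipidemia else 1.0
    odds *= ODDS_RATIOS["typical_angina"] if typical_angina else 1.0
    odds *= ODDS_RATIOS["prior_noninvasive_test"] if prior_test else 1.0
    return odds

# Example: a 66-year-old diabetic man with typical angina
print(relative_odds(True, 66, True, False, True, False))
```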