912 results for Numerical Algorithms and Problems
Abstract:
Based on our previous work, we investigate here the effects on the wind and magnetospheric structures of weak-lined T Tauri stars due to a misalignment between the axis of rotation of the star and its magnetic dipole moment vector. In such a configuration, the system loses the axisymmetry present in the aligned case, requiring a fully three-dimensional (3D) approach. We perform 3D numerical magnetohydrodynamic simulations of stellar winds and study the effects caused by different model parameters, namely the misalignment angle θ_t, the stellar period of rotation, the plasma β, and the heating index. Our simulations take into account the interplay between the wind and the stellar magnetic field during the time evolution. The system reaches a periodic behavior with the same period as the stellar rotation. We show that the magnetic field lines present an oscillatory pattern. Furthermore, we find that the wind velocity increases with θ_t, especially in the case of a strong magnetic field and relatively rapid stellar rotation. Our 3D, time-dependent wind models allow us to study the interaction of a magnetized wind with a magnetized extrasolar planet. Such interaction gives rise to reconnection, generating electrons that propagate along the planet's magnetic field lines and produce electron cyclotron radiation at radio wavelengths. The power released in the interaction depends on the planet's magnetic field intensity, its orbital radius, and the local characteristics of the stellar wind. We find that a close-in Jupiter-like planet orbiting at 0.05 AU presents a radio power ~5 orders of magnitude larger than that observed for Jupiter, which suggests that the stellar wind from a young star has the potential to generate strong planetary radio emission that could be detected in the near future with LOFAR. This radio power varies according to the rotational phase of the star. For three selected simulations, we find a variation of the radio power by a factor of 1.3-3.7, depending on θ_t. Moreover, we extend the investigation done in Vidotto et al. and analyze whether winds from misaligned stellar magnetospheres could cause a significant effect on planetary migration. Compared to the aligned case, we show that the timescale τ_w for an appreciable radial motion of the planet is shorter for larger misalignment angles. While for the aligned case τ_w ≃ 100 Myr, for a stellar magnetosphere tilted by θ_t = 30 degrees τ_w ranges from ~40 to 70 Myr for a planet located at a radius of 0.05 AU. Further reduction of τ_w might occur for even larger misalignment angles and/or different wind parameters.
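Although the abstract does not give the underlying expressions, the stated dependence of the radio power on the planetary field, the orbital distance and the local wind conditions can be illustrated with a generic radiometric Bode's-law-type estimate: the emitted radio power is taken as a small fixed fraction of the wind kinetic power intercepted by the planetary magnetosphere, whose size follows from pressure balance between the wind ram pressure and the planetary dipole field. The sketch below is only such an illustration, not the paper's model; all parameter values (density, speed, field strength, efficiency) are hypothetical.

```python
import numpy as np

MU0 = 4e-7 * np.pi          # vacuum permeability [H/m]
R_JUP = 7.149e7             # Jupiter radius [m]

def radio_power(n_p, v, B_planet, R_planet=R_JUP, efficiency=1e-3, m_p=1.67e-27):
    """Generic radio-power estimate for a magnetized planet in a stellar wind.

    n_p        : local wind proton number density [m^-3] (hypothetical value)
    v          : wind speed relative to the planet [m/s]
    B_planet   : planetary surface dipole field strength [T]
    efficiency : assumed fraction of the intercepted power emitted in radio
    """
    rho = n_p * m_p                                   # wind mass density [kg/m^3]
    ram = rho * v**2                                  # wind ram pressure [Pa]
    # Magnetospheric stand-off radius from B(r)^2 / (2 mu0) = ram,
    # with a dipole field B(r) = B_planet * (R_planet / r)^3.
    r_m = R_planet * (B_planet**2 / (2.0 * MU0 * ram))**(1.0 / 6.0)
    r_m = max(r_m, R_planet)                          # magnetosphere cannot lie inside the planet
    kinetic_power = 0.5 * rho * v**3 * np.pi * r_m**2 # wind kinetic power hitting the obstacle
    return efficiency * kinetic_power                 # radio power [W]

# Hypothetical close-in Jupiter-like planet at 0.05 AU in a dense, fast young-star wind:
print(f"{radio_power(n_p=5e13, v=3e5, B_planet=4e-4):.2e} W")
```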
Abstract:
Neotropical forests have brought forth a large proportion of the world's terrestrial biodiversity, but the underlying evolutionary mechanisms and their timing require further elucidation. Despite insights gained from phylogenetic studies, uncertainties about molecular clock rates have hindered efforts to determine the timing of diversification processes. Moreover, most molecular research has been detached from the extensive body of data on Neotropical geology and paleogeography. Here we examine phylogenetic relationships and the timing of speciation events in a Neotropical flycatcher genus (Myiopagis) by using calibrations from modern geologic data in conjunction with a number of recently developed DNA sequence dating algorithms, and by comparing these estimates with those based on a range of previously proposed molecular clock rates. We present a well-supported hypothesis of systematic relationships within the genus. Our age estimates of Myiopagis speciation events based on paleogeographic data are in close agreement with nodal ages derived from a "traditional" avian mitochondrial 2%/My clock, while contradicting other clock rates. Our comparative approach corroborates the consistency of the traditional avian mitochondrial clock rate of 2%/My for tyrant-flycatchers. Nevertheless, our results argue against the indiscriminate use of molecular clock rates in evolutionary research and advocate the verification of the appropriateness of the traditional clock rate by means of independent calibrations in individual studies. (C) 2009 Elsevier Inc. All rights reserved.
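For reference, the arithmetic behind the "traditional" 2%/My clock is simply a division of pairwise sequence divergence by the clock rate; the snippet below is a minimal sketch of that conversion with a hypothetical divergence value, not a reproduction of the dating algorithms used in the study.

```python
def node_age_my(pairwise_divergence_percent, clock_rate_percent_per_my=2.0):
    """Age of a split (in Myr) under a strict molecular clock.

    The conventional avian mitochondrial clock of ~2%/My refers to pairwise
    sequence divergence between the two daughter lineages, so the node age is
    simply divergence divided by the rate.
    """
    return pairwise_divergence_percent / clock_rate_percent_per_my

# Hypothetical example: two taxa differing by 5.2% in mtDNA sequence.
print(node_age_my(5.2))   # -> 2.6 Myr under the traditional 2%/My clock
```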
Abstract:
A method for linearly constrained optimization which modifies and generalizes recent box-constraint optimization algorithms is introduced. The new algorithm is based on a relaxed form of Spectral Projected Gradient iterations. Intercalated with these projected steps, internal iterations restricted to faces of the polytope are performed, which enhance the efficiency of the algorithm. Convergence proofs are given, and numerical experiments are included and discussed. Software supporting this paper is available through the Tango Project web page: http://www.ime.usp.br/~egbirgin/tango/.
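For readers unfamiliar with Spectral Projected Gradient iterations, the sketch below shows a minimal SPG loop for the simplest (box-constrained) case, with a Barzilai-Borwein spectral step length; it omits the nonmonotone line search, the relaxed projections and the face-restricted inner iterations that distinguish the algorithm of the paper, and the test problem is hypothetical.

```python
import numpy as np

def spg_box(grad_f, x0, lower, upper, max_iter=200, tol=1e-8):
    """Minimal Spectral Projected Gradient sketch for box constraints.

    Projected-gradient iteration with a Barzilai-Borwein (spectral) step;
    the nonmonotone line search and the face-restricted inner iterations
    of the paper's algorithm are omitted.
    """
    project = lambda z: np.clip(z, lower, upper)
    x = project(np.asarray(x0, dtype=float))
    g = grad_f(x)
    lam = 1.0                                        # initial spectral step length
    for _ in range(max_iter):
        x_new = project(x - lam * g)
        if np.linalg.norm(x_new - x) <= tol:         # projected step is (almost) zero
            break
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g
        sy = float(s @ y)
        lam = float(s @ s) / sy if sy > 0 else 1.0   # Barzilai-Borwein step length
        x, g = x_new, g_new
    return x

# Hypothetical test problem: minimize 0.5*x'Ax - b'x subject to 0 <= x <= 1.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 4.0])
print(spg_box(lambda x: A @ x - b, x0=[0.5, 0.5], lower=0.0, upper=1.0))  # -> [0. 1.]
```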
Abstract:
Purpose - The purpose of this paper is to develop a novel unstructured simulation approach for injection molding processes described by the Hele-Shaw model. Design/methodology/approach - The scheme involves dual dynamic meshes with active and inactive cells determined from an initial background pointset. The quasi-static pressure solution in each timestep for this evolving unstructured mesh system is approximated using a control volume finite element method formulation coupled to a corresponding modified volume of fluid method. The flow is considered to be isothermal and non-Newtonian. Findings - Supporting numerical tests and performance studies for polystyrene described by Carreau, Cross, Ellis and Power-law fluid models are conducted. Results for the present method are shown to be comparable to those from other methods for both Newtonian fluid and polystyrene fluid injected in different mold geometries. Research limitations/implications - With respect to the methodology, the background pointset implies a mesh that is dynamically reconstructed here, and there are a number of efficiency issues and improvements that would be relevant to industrial applications. For instance, one can use the pointset to construct special bases and invoke a so-called "meshless" scheme using the basis. This would require some interesting strategies to deal with the dynamic point enrichment of the moving front that could benefit from the present front treatment strategy. There are also issues related to mass conservation and fill-time errors that might be addressed by introducing suitable projections. The general question of the "rate of convergence" of these schemes requires analysis. Numerical results here suggest first-order accuracy and are consistent with the approximations made, but theoretical results are not available yet for these methods. Originality/value - This novel unstructured simulation approach involves dual meshes with active and inactive cells determined from an initial background pointset: local active dual patches are constructed "on-the-fly" for each "active point" to form a dynamic virtual mesh of active elements that evolves with the moving interface.
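The generalized-Newtonian viscosity laws named above (power-law, Carreau, Cross) have standard textbook forms; the sketch below evaluates them over a range of shear rates with hypothetical parameter values that are not fitted to the polystyrene used in the paper (the Ellis model, which is expressed in terms of shear stress, is omitted).

```python
import numpy as np

def power_law(gamma_dot, K=1.0e4, n=0.3):
    """Power-law (Ostwald-de Waele) viscosity: eta = K * gamma_dot**(n - 1)."""
    return K * gamma_dot ** (n - 1.0)

def carreau(gamma_dot, eta0=1.0e4, eta_inf=0.0, lam=1.0, n=0.3):
    """Carreau model: eta = eta_inf + (eta0 - eta_inf) * (1 + (lam*gd)**2)**((n - 1) / 2)."""
    return eta_inf + (eta0 - eta_inf) * (1.0 + (lam * gamma_dot) ** 2) ** ((n - 1.0) / 2.0)

def cross(gamma_dot, eta0=1.0e4, eta_inf=0.0, lam=1.0, m=0.7):
    """Cross model: eta = eta_inf + (eta0 - eta_inf) / (1 + (lam*gd)**m)."""
    return eta_inf + (eta0 - eta_inf) / (1.0 + (lam * gamma_dot) ** m)

gd = np.logspace(-2, 4, 7)            # shear rates [1/s]
for model in (power_law, carreau, cross):
    print(model.__name__, np.round(model(gd), 2))   # shear-thinning viscosity curves [Pa*s]
```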
Abstract:
This paper presents the formulation of a combinatorial optimization problem with the following characteristics: (i) the search space is the power set of a finite set, structured as a Boolean lattice; (ii) the cost function forms a U-shaped curve when applied to any lattice chain. This formulation applies to feature selection in the context of pattern recognition. The known approaches for this problem are branch-and-bound algorithms and heuristics that explore the search space only partially. Branch-and-bound algorithms are equivalent to a full search, while heuristics are not. This paper presents a branch-and-bound algorithm that differs from the known ones by exploiting the lattice structure and the U-shaped chain curves of the search space. The main contribution of this paper is the architecture of this algorithm, which is based on the representation and exploration of the search space via new lattice properties proven here. Several experiments with well-known public data indicate the superiority of the proposed method over sequential floating forward selection (SFFS), a popular heuristic that gives good results in very short computational time. In all experiments, the proposed method obtained better or equal results in similar or even smaller computational time. (C) 2009 Elsevier Ltd. All rights reserved.
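As a point of reference for the comparison above, the baseline heuristic SFFS can be sketched as follows: a forward step adds the feature that most reduces the cost, followed by "floating" backward steps that drop features while doing so improves on the best known subset of that size. This is a generic illustration with a hypothetical U-shaped cost function, not the branch-and-bound algorithm proposed in the paper.

```python
def sffs(features, cost, k_max):
    """Generic Sequential Floating Forward Selection sketch.

    features : iterable of candidate feature indices
    cost     : function mapping a frozenset of features to a cost (lower is better)
    k_max    : stop once the current subset reaches this size
    Returns the best subset found among all sizes visited.
    """
    selected = frozenset()
    best = {0: (selected, cost(selected))}            # best subset seen per subset size
    while len(selected) < k_max:
        # Forward step: add the single feature that most reduces the cost.
        selected = min((selected | {f} for f in features if f not in selected), key=cost)
        if len(selected) not in best or cost(selected) < best[len(selected)][1]:
            best[len(selected)] = (selected, cost(selected))
        # Floating (backward) steps: drop features while that improves the best of that size.
        while len(selected) > 1:
            reduced = min((selected - {f} for f in selected), key=cost)
            if cost(reduced) < best[len(reduced)][1]:
                selected = reduced
                best[len(selected)] = (selected, cost(selected))
            else:
                break
    return min(best.values(), key=lambda t: t[1])[0]

# Hypothetical U-shaped cost on chains: distance from an "ideal" subset {1, 3} plus a size penalty.
cost = lambda s: len(s ^ {1, 3}) + 0.1 * len(s)
print(sorted(sffs(range(6), cost, k_max=4)))          # -> [1, 3]
```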
Abstract:
Colour segmentation is the most commonly used method in road sign detection. Road signs contain several basic colours, such as red, yellow, blue and white, which depend on the country. The objective of this thesis is to evaluate four colour segmentation algorithms: the Dynamic Threshold Algorithm, a modification of de la Escalera's algorithm, the Fuzzy Colour Segmentation Algorithm and the Shadow and Highlight Invariant Algorithm. Processing time and segmentation success rate are used as criteria to compare the performance of the four algorithms, and red is selected as the target colour for the comparison. All test images are selected randomly, by category, from the Traffic Signs Database of Dalarna University [1]. These road sign images were taken with a digital camera mounted in a moving car in Sweden. Experiments show that the Fuzzy Colour Segmentation Algorithm and the Shadow and Highlight Invariant Algorithm are more accurate and stable in detecting the red colour of road signs. The method could also be used in analyses of other colours. For an evaluation of the four algorithms on the yellow colour, see the Master's thesis of Yumei Liu.
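For orientation, red segmentation by colour thresholding can be sketched as below in HSV space, where red wraps around the hue axis; this is a generic fixed-threshold illustration with hypothetical threshold values, not one of the four algorithms evaluated in the thesis.

```python
import cv2
import numpy as np

def segment_red(bgr_image):
    """Generic fixed-threshold red segmentation in HSV (illustration only).

    Red wraps around the hue axis in OpenCV's 0-179 hue range, so two hue
    intervals are combined. All threshold values are hypothetical.
    """
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    lower = cv2.inRange(hsv, (0, 70, 50), (10, 255, 255))      # reddish hues near 0
    upper = cv2.inRange(hsv, (170, 70, 50), (179, 255, 255))   # reddish hues near 180
    mask = cv2.bitwise_or(lower, upper)
    # Light morphological opening to suppress isolated noise pixels.
    kernel = np.ones((3, 3), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

# Usage (hypothetical file name); success rate and processing time would then be
# measured against ground-truth masks:
# mask = segment_red(cv2.imread("road_sign.jpg"))
```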
Abstract:
The general aim of this thesis was to investigate behavioral change communication at nurse-led chronic obstructive pulmonary disease (COPD) clinics in primary health care, focusing on communication in self-management and smoking cessation for patients with COPD. Designs: Observational, prospective observational and experimental designs were used. Methods: To explore and describe the structure and content of self-management education and smoking cessation communication, consultations between patients (n=30) and nurses (n=7) were videotaped and analyzed with three instruments: Consulting Map (CM), the Motivational Interviewing Treatment Integrity (MITI) scale and the Client Language Assessment in Motivational Interviewing (CLAMI). To examine the effects of structured self-management education, patients with COPD (n=52) were randomized into an intervention and a control group. Patients' quality of life (QoL), knowledge about COPD and smoking cessation were examined with a questionnaire on knowledge about COPD and smoking habits and with St. George's Respiratory Questionnaire, addressing QoL. Results: The findings from the videotaped consultations showed that communication about the reasons for consultation mainly concerned medical and physical problems and, to a certain extent, patients' perceptions. Two consultations ended with shared understanding, but none of the patients received an individual treatment plan. In the smoking cessation communication the nurses only to a small extent evoked patients' reasons for change, fostered collaboration and supported patients' autonomy. The nurses provided a lot of information (42%), asked closed (21%) rather than open questions (3%), made simple (14%) rather than complex (2%) reflections and used MI non-adherent (16%) rather than MI-adherent (5%) behavior. Most of the patients' utterances in the communication were neutral, neither toward nor away from smoking cessation (59%); utterances about reasons (desire, ability and need) accounted for 40%, taking steps for 1% and commitment to stop smoking for 0%. The number of patients who stopped smoking, patients' knowledge about the disease and their QoL were increased by structured self-management education and smoking cessation in collaboration between the patient, nurse and physician and, when necessary, a physiotherapist, a dietician, an occupational therapist and/or a medical social worker. Conclusion: The communication at nurse-led COPD clinics rarely involved the patients in shared understanding and responsibility, and concerned patients' fears, worries and problems only to a limited extent. The results also showed that nurses had difficulties in attaining proficiency in behavioral change communication. Structured self-management education showed positive effects on patients' perceived QoL, on the number of patients who quit smoking and on patients' knowledge about COPD.
Abstract:
Research objectives: Poker and responsible gambling both entail the use of the executive functions (EF), which are higher-level cognitive abilities. The main objective of this work was to assess whether online poker players of different ability show different performance in their EF and, if so, which functions are the most discriminating ones. The secondary objective was to assess whether EF performance can predict the quality of gambling, according to the Gambling Related Cognition Scale (GRCS), the South Oaks Gambling Screen (SOGS) and the Problem Gambling Severity Index (PGSI). Sample and methods: The study design consisted of two stages: 46 Italian active players (41 m, 5 f; age 32 ± 7.1 years; education 14.8 ± 3 years) completed the PGSI in a secure IT web system and uploaded their own hand history files, which were anonymized and then evaluated by two poker experts. 36 of these players (31 m, 5 f; age 33 ± 7.3 years; education 15 ± 3 years) agreed to take part in the second stage: the administration of an extensive neuropsychological test battery by a blinded trained professional. To answer the main research question we collected all final and intermediate scores of the EF tests for each player together with the score on playing ability. To answer the secondary research question, we referred to the GRCS, PGSI and SOGS scores. We determined which variables are good predictors of the playing ability score using statistical techniques able to deal with many regressors and few observations (LASSO, best subset algorithms and CART). In this context information criteria and cross-validation errors play a key role in the selection of the relevant regressors, while significance testing and goodness-of-fit measures can lead to wrong conclusions. Preliminary findings: We found significant predictors of the poker ability score in various tests. In particular, there are good predictors 1) in some Wisconsin Card Sorting Test items that measure flexibility in choosing problem-solving strategies, strategic planning, modulation of impulsive responding, goal setting and self-monitoring, 2) in those Cognitive Estimates Test variables related to deductive reasoning, problem solving, development of an appropriate strategy and self-monitoring, and 3) in the Emotional Quotient Inventory Short (EQ-i:S) Stress Management score, composed of the Stress Tolerance and Impulse Control scores, and in the Interpersonal score (Empathy, Social Responsibility, Interpersonal Relationship). As for the quality of gambling, some EQ-i:S scale scores provide the best predictors: General Mood for the PGSI; Intrapersonal (Self-Regard, Emotional Self-Awareness, Assertiveness, Independence, Self-Actualization) and Adaptability (Reality Testing, Flexibility, Problem Solving) for the SOGS; and Adaptability for the GRCS. Implications for the field: Through PokerMapper we gathered knowledge and evaluated the feasibility of constructing short tasks/card games in online poker environments for profiling users' executive functions. These card games will be part of an IT system able to dynamically profile EF and provide players with feedback on their expected performance and ability to gamble responsibly at that particular moment. The implementation of such a system in existing gambling platforms could lead to an effective proactive tool for supporting responsible gambling.
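The variable-selection setting described above (many regressors, few observations, penalty chosen by cross-validation error) can be illustrated with a LASSO fit on simulated data; the sketch below uses scikit-learn's LassoCV on stand-in data and is not the study's analysis.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

# Simulated stand-in data: 36 "players", 60 candidate test scores, 3 truly relevant ones.
rng = np.random.default_rng(0)
n_players, n_scores = 36, 60
X = rng.normal(size=(n_players, n_scores))            # stand-in for EF test scores
true_beta = np.zeros(n_scores)
true_beta[[3, 7, 12]] = [1.5, -1.0, 0.8]
y = X @ true_beta + rng.normal(scale=0.5, size=n_players)   # stand-in for the ability score

X_std = StandardScaler().fit_transform(X)             # standardize before the L1 penalty
lasso = LassoCV(cv=5, random_state=0).fit(X_std, y)   # penalty strength chosen by CV error
selected = np.flatnonzero(lasso.coef_)                # regressors kept by the L1 penalty
print("CV-selected alpha:", round(lasso.alpha_, 4))
print("selected regressors:", selected)
```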
Abstract:
While the simulation of flood risks originating from the overtopping of river banks is well covered within continuously evaluated programs to improve flood protection measures, flash flooding is not. Flash floods are triggered by short, local thunderstorm cells with high precipitation intensities. Small catchments have short response times and flow paths, and convective thunderstorm cells may result in flooding of endangered settlements. Assessing local flooding and flood pathways requires a detailed hydraulic simulation of the surface runoff. Hydrological models usually do not incorporate surface runoff at this level of detail; instead, empirical equations are applied for runoff detention. Conversely, 2D hydrodynamic models usually do not allow distributed rainfall as input, nor do they implement any type of soil/surface interaction as hydrological models do. In view of several cases of local flash flooding during recent years, the issue has emerged both for practical reasons and as a research topic aimed at closing the model gap between distributed rainfall and distributed runoff formation. Therefore, a 2D hydrodynamic model, solving the depth-averaged flow equations with a finite volume discretization, was extended to accept direct rainfall, enabling simulation of the associated runoff formation. The model itself is used as the numerical engine; rainfall is introduced via the modification of water levels at fixed time intervals. The paper not only deals with the general application of the software but also tests the numerical stability and reliability of the simulation results. The tests use different artificial as well as measured rainfall series as input. Key parameters of the simulation, such as losses, roughness or the time intervals for the water level manipulations, are tested regarding their impact on stability.
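The coupling idea described above, introducing rainfall by modifying water levels at fixed time intervals, can be sketched as follows; the function name, the constant-loss treatment and all numbers are hypothetical, and the hydrodynamic engine itself is only indicated by a placeholder comment.

```python
import numpy as np

def apply_rainfall(water_depth, rain_intensity_mm_h, coupling_interval_s, losses_mm_h=0.0):
    """Schematic coupling step: raise every cell's water level by the rainfall
    accumulated since the last water-level manipulation.

    water_depth          : array of cell water depths [m], updated in place
    rain_intensity_mm_h  : distributed rainfall for this interval [mm/h], scalar or per-cell array
    coupling_interval_s  : fixed interval between water-level manipulations [s]
    losses_mm_h          : simple constant loss rate (e.g. infiltration) [mm/h]
    """
    effective_mm_h = np.maximum(np.asarray(rain_intensity_mm_h) - losses_mm_h, 0.0)
    added_depth_m = effective_mm_h / 1000.0 / 3600.0 * coupling_interval_s
    water_depth += added_depth_m
    return water_depth

# Usage: 20 mm/h rain, 2 mm/h losses, water levels adjusted every 60 s of simulated time.
h = np.zeros(1000)                       # initially dry 1000-cell domain
for _ in range(10):                      # ten coupling steps = 600 s
    # ... run the 2D finite-volume engine for coupling_interval_s seconds here ...
    apply_rainfall(h, rain_intensity_mm_h=20.0, coupling_interval_s=60.0, losses_mm_h=2.0)
print(h[0])                              # 0.003 m, i.e. 18 mm/h effective rain over 600 s
```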
Abstract:
The objective of this work is the design of slender reinforced concrete columns under short- and long-term loads, based on a realistic analysis of their deformations. Three numerical algorithms are presented for obtaining the bending moment-axial force-curvature relations of an arbitrary reinforced concrete section under combined axial compression and uniaxial bending. Creep and shrinkage strains of the concrete are included in the analysis through a modification of these relations. Some design-code criteria for slender reinforced concrete columns are presented and compared with one another and with the numerical algorithm developed here. Provisions of NB-1/78 concerning column design are analyzed, verifying the level of accuracy obtained. A simplified procedure proposed by CEB for including concrete creep in the design is tested, and a solution for reinforced concrete columns with symmetric elastic end restraints is presented in order to assess the error introduced by extending the effective (buckling) length concept to reinforced concrete columns. A series of experimental examples is presented in which the accuracy of the numerical design solution is verified. Several tables were developed for the design of slender columns with rectangular cross-section and symmetric reinforcement. The entire study is restricted to the case of combined axial compression and uniaxial bending.
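The building block of such moment-axial force-curvature relations is the evaluation of the section forces for a given plane strain profile; the sketch below does this for a rectangular section discretized into fibers, with a simplified parabola-rectangle concrete law and elastic-perfectly-plastic steel. Dimensions, material values and reinforcement layout are hypothetical, and creep, shrinkage and code checks are not included.

```python
import numpy as np

def section_forces(curvature, eps_centroid, b=0.3, h=0.5, fc=25e6, fy=500e6,
                   Es=200e9, rebar=((0.05, 1e-3), (0.45, 1e-3)), n_fibers=200):
    """Axial force N and bending moment M of a rectangular RC section for a given
    strain profile eps = eps_centroid + curvature * y (plane sections).

    Simplified sketch: parabola-rectangle law for concrete in compression (no tension),
    elastic-perfectly-plastic steel. Compression is positive; y is measured from the
    section centroid; rebar is given as (distance from bottom [m], area [m^2]) pairs.
    """
    y = np.linspace(-h / 2, h / 2, n_fibers)          # fiber positions [m]
    dA = b * h / n_fibers                             # fiber area [m^2]
    eps = eps_centroid + curvature * y                # strain in each fiber
    # Concrete: parabola up to 0.002, plateau up to 0.0035, no tensile strength.
    ec = np.clip(eps, 0.0, 0.0035)
    sig_c = np.where(ec <= 0.002, fc * (2 * ec / 0.002 - (ec / 0.002) ** 2), fc)
    N = np.sum(sig_c * dA)
    M = np.sum(sig_c * dA * y)
    for y_bar, As in rebar:                           # reinforcement contribution
        yb = y_bar - h / 2                            # convert to centroidal coordinate
        sig_s = np.clip(Es * (eps_centroid + curvature * yb), -fy, fy)
        N += sig_s * As
        M += sig_s * As * yb
    return N, M

# Example: curvature 0.005 1/m with centroidal strain 0.001 (hypothetical state).
N, M = section_forces(curvature=0.005, eps_centroid=0.001)
print(f"N = {N/1e3:.0f} kN, M = {M/1e3:.0f} kN*m")
```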
Abstract:
In this work, we analytically extend the LTSN formulation to one-dimensional transport problems without azimuthal symmetry. For this problem, we also present the solution with continuous dependence on the angular variable, from which an iterative method for solving the one-dimensional transport equation is established. We also discuss how the LTSN formulation is applied to time-dependent one-dimensional transport problems, both approximately, through numerical inversion of the flux transformed in the time variable, and analytically, by applying the LTSN method to the nodal equations. Numerical simulations and comparisons with results available in the literature are presented.
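For context, the discrete ordinates (S_N) discretization on which the LTSN formulation builds replaces the angular integral of the transport equation by a quadrature; the snippet below only illustrates a Gauss-Legendre quadrature of the polar variable for N = 8 and is not the LTSN solution itself (problems without azimuthal symmetry also require a quadrature in the azimuthal angle).

```python
import numpy as np

# Gauss-Legendre ordinates mu_k and weights w_k for the polar variable mu in [-1, 1],
# the quadrature commonly used in S_N angular discretizations (N = 8 as an example).
N = 8
mu, w = np.polynomial.legendre.leggauss(N)

print(np.round(mu, 4))                          # discrete ordinates
print(np.round(w, 4))                           # quadrature weights
print("sum of weights:", np.sum(w))             # = 2, the length of [-1, 1]
print("integral of mu^2:", np.sum(w * mu**2))   # = 2/3, exact for Gauss-Legendre
```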
Abstract:
The number of research papers available today is growing at a staggering rate, generating a huge amount of information that people cannot keep up with. According to a trend indicated by the United States' National Science Foundation, more than 10 million new papers will be published in the next 20 years. Because most of these papers will be available on the Web, this research focuses on exploring issues in recommending research papers to users, in order to lead users directly to papers of their interest. Recommender systems are used to recommend items to users among a huge stream of available items, according to users' interests. This research focuses on the two most prevalent techniques to date, namely Content-Based Filtering and Collaborative Filtering. The first explores the text of the paper itself, recommending items similar in content to the ones the user has rated in the past. The second explores the citation web existing among papers. As these two techniques have complementary advantages, we explored hybrid approaches to recommending research papers. We created standalone and hybrid versions of the algorithms and evaluated them through both offline experiments on a database of 102,295 papers and an online experiment with 110 users. Our results show that the two techniques can be successfully combined to recommend papers. Coverage is also increased, reaching 100% with the hybrid algorithms. In addition, we found that different algorithms are more suitable for recommending different kinds of papers. Finally, we verified that users' research experience influences the way they perceive recommendations. In parallel, we found no significant differences in recommending papers to users from different countries. However, our results showed that users interacting with a research paper recommender system are much happier when the interface is presented in their native language, regardless of the language in which the papers are written. Therefore, the interface should be tailored to the user's native language.
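One simple way to combine the two techniques discussed above is a weighted blend of a content-based score (TF-IDF cosine similarity to the papers a user liked) and a collaborative score (item-item similarity of the rating columns); the sketch below shows this on a tiny made-up corpus and rating matrix and is not the system evaluated in this research.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical corpus of paper abstracts and a small users-by-papers "like" matrix.
abstracts = [
    "collaborative filtering for recommender systems",
    "content based filtering with tf-idf text features",
    "hybrid recommender systems for research papers",
    "deep learning for image classification",
]
ratings = np.array([            # users x papers, 1 = liked
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 0],
])
liked_by_target_user = [0, 1]   # papers the target user already liked

# Content-based score: similarity of every paper to the user's liked papers.
tfidf = TfidfVectorizer().fit_transform(abstracts)
content_score = cosine_similarity(tfidf, tfidf[liked_by_target_user]).mean(axis=1)

# Collaborative score: item-item similarity of the rating columns, averaged over liked papers.
item_sim = cosine_similarity(ratings.T)
collab_score = item_sim[:, liked_by_target_user].mean(axis=1)

hybrid = 0.5 * content_score + 0.5 * collab_score   # equal weights, a hypothetical choice
candidates = [i for i in range(len(abstracts)) if i not in liked_by_target_user]
print(sorted(candidates, key=lambda i: hybrid[i], reverse=True))   # best candidate first
```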