943 results for "Comparison with optical model calculations using the Sao Paulo potential and other data"


Relevance: 100.00%

Abstract:

A simplified CFD wake model based on the actuator-disk concept is used to simulate the wind turbine, which is represented by an actuator disk on which a distribution of forces, defined as axial momentum sources, is applied to the incoming flow. The rotor is assumed to be uniformly loaded, with the exerted forces a function of the incident wind speed, the thrust coefficient and the rotor diameter. The model is validated against experimental measurements of the wind-speed deficit downwind of a wind turbine. Validation against turbulence-intensity measurements will follow in the near future.
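As a rough illustration of the force definition above, here is a minimal Python sketch based on standard momentum theory rather than the paper's implementation; the density value and all names are assumptions made for the example:

    import numpy as np

    # Uniformly loaded actuator disk: total thrust from momentum theory,
    # spread over the disk volume as an axial momentum sink [N/m^3].
    RHO = 1.225  # assumed air density [kg/m^3]

    def axial_momentum_source(u_inf, c_t, diameter, dx):
        """Source term for a disk of thickness dx under a uniform rotor load."""
        area = np.pi * (diameter / 2.0) ** 2
        thrust = 0.5 * RHO * area * c_t * u_inf ** 2  # rotor thrust [N]
        return -thrust / (area * dx)  # per unit volume, opposing the flow

    # Example: 80 m rotor, C_T = 0.75, 8 m/s wind, disk one 2 m cell thick
    print(axial_momentum_source(8.0, 0.75, 80.0, 2.0))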

Relevance: 100.00%

Abstract:

Over the last few years, the Pennsylvania State University (PSU), under the sponsorship of the US Nuclear Regulatory Commission (NRC), has prepared, organized, conducted, and summarized two international benchmarks based on the NUPEC data: the OECD/NRC Full-Size Fine-Mesh Bundle Test (BFBT) Benchmark and the OECD/NRC PWR Sub-Channel and Bundle Test (PSBT) Benchmark. The benchmark activities have been conducted in cooperation with the Nuclear Energy Agency of the Organization for Economic Co-operation and Development (NEA/OECD) and the Japan Nuclear Energy Safety (JNES) Organization. This paper presents an application of the joint Penn State University/Technical University of Madrid (UPM) version of the well-known sub-channel code COBRA-TF (Coolant Boiling in Rod Array-Two Fluid), namely CTF, to the steady-state critical power and departure from nucleate boiling (DNB) exercises of the OECD/NRC BFBT and PSBT benchmarks. The goal is two-fold: firstly, to assess these models and examine their strengths and weaknesses; and secondly, to identify areas for improvement.

Relevance: 100.00%

Abstract:

The continuous improvement of management and assessment processes for curricular external internships has led a group of university teachers specialised in this area to develop a mixed measurement model that combines verification of skill acquisition by students choosing external internships with the satisfaction of the parties involved in that process: academics, educational tutors from companies and organisations, and administration and services personnel. The experience, developed at the University of Alicante, has been carried out in the degrees of Business Administration and Management, Business Studies, Economics, Advertising and Public Relations, Sociology and Social Work, all part of the Faculty of Economics and Business. By designing and managing closed standardised interviews and other research tools, validated outside the centre, a system of continuous improvement and quality assurance has been created, clearly contributing to the gradual increase in the number of students taking internships in this Faculty, as well as to improved satisfaction, efficiency and efficacy indicators at a global level. As this experience of educational innovation has shown, the acquisition of curricular knowledge, skills, abilities and competences by the students is directly correlated with the satisfaction of the parties involved in a process that takes the student beyond the physical borders of a university campus. Ensuring the latter is made easier by implementing a mixed assessment method, combining continuous and final assessment and characterised by its rigour and simple management. This report presents that model, itself subject to persistent and continuous control, in which all parties involved in the external internships take part. Its short-term results show an increase, estimated at 15% for the last academic year, in the number of students choosing curricular internships and, in the medium and long term, a closer interweaving between the academic world and its social and productive environment, in both the business and institutional spheres. The potential of this assessment model lies not only in the quality of its measurement tools, but also in the effects of its use on the various groups and in the actions carried out as a result of its implementation, which, as shown below, are the real guarantee of continuous improvement.

Relevance: 100.00%

Abstract:

The Appetitive Motivation Scale (Jackson & Smillie, 2004) is a new trait conceptualisation of Gray's (1970, 1991) Behavioural Activation System. In this experiment we explore the relationships that the Appetitive Motivation Scale and other measures of Gray's model have with Approach and Active Avoidance responses. Using a sample of 144 undergraduate students, both Appetitive Motivation and Sensitivity to Reward (from the Sensitivity to Punishment and Sensitivity to Reward Questionnaire, SPSRQ; Torrubia, Avila, Molto, & Caseras, 2001) were found to be significant predictors of Approach and Active Avoidance response latency. This confirms previous experimental validations of the SPSRQ (e.g., Avila, 2001) and provides the first experimental evidence for the validity of the Appetitive Motivation Scale. Consistent with interactive views of Gray's model (e.g., Corr, 2001), high SPSRQ Sensitivity to Punishment diminished the relationship between Sensitivity to Reward and our BAS criteria. Measures of BIS did not, however, interact in this way with the Appetitive Motivation Scale. A surprising result was the failure of any of Carver and White's (1994) BAS scales to correlate with RST criteria. Implications of these findings and potential future directions are discussed. (C) 2004 Elsevier Ltd. All rights reserved.
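The moderation effect reported above is the kind usually tested with a moderated regression (predictor, moderator, and their product). Purely as an illustration, with synthetic data and hypothetical variable names rather than the study's dataset or exact analysis:

    import numpy as np
    import statsmodels.api as sm

    # Synthetic stand-in for 144 participants: does Sensitivity to
    # Punishment (sp) moderate the effect of Sensitivity to Reward (sr)
    # on approach response latency?
    rng = np.random.default_rng(0)
    n = 144
    sr = rng.normal(size=n)
    sp = rng.normal(size=n)
    latency = -0.4 * sr + 0.2 * sp + 0.15 * sr * sp + rng.normal(size=n)

    X = sm.add_constant(np.column_stack([sr, sp, sr * sp]))
    fit = sm.OLS(latency, X).fit()
    print(fit.params)  # the last coefficient is the interaction term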

Relevance: 100.00%

Abstract:

The particle-based lattice solid model developed to study the physics of rocks and the nonlinear dynamics of earthquakes is refined by incorporating intrinsic friction between particles. The model provides a means for studying the causes of seismic wave attenuation, as well as frictional heat generation, fault zone evolution, and localisation phenomena. A modified velocity-Verlet scheme that allows friction to be precisely modelled is developed. This is a difficult computational problem given that a discontinuity must be accurately simulated by the numerical approach (i.e., the transition from static to dynamic frictional behaviour). This is achieved using a half-time-step integration scheme. At each half time step, a nonlinear system is solved to compute the static frictional forces and states of touching particle pairs. Improved efficiency is achieved by adaptively adjusting the time-step increment depending on the particle velocities in the system. The total energy is calculated and verified to remain constant to a high precision during simulations. Numerical experiments show that the model can be applied to the study of earthquake dynamics, the stick-slip instability, heat generation, and fault zone evolution. Such experiments may lead to a conclusive resolution of the heat flow paradox and improved understanding of earthquake precursory phenomena and dynamics. (C) 1999 Academic Press.
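The half-step structure described above resembles the standard velocity-Verlet splitting. A minimal one-particle sketch of that splitting (with a placeholder spring force where the paper solves a nonlinear system for static friction; names and values are illustrative):

    import numpy as np

    def step(x, v, force, m=1.0, dt=1e-3):
        a = force(x, v) / m
        v_half = v + 0.5 * dt * a           # advance velocity a half step
        x_new = x + dt * v_half             # advance position a full step
        a_new = force(x_new, v_half) / m    # frictional forces would be solved here
        return x_new, v_half + 0.5 * dt * a_new  # complete the velocity update

    spring = lambda x, v: -10.0 * x  # placeholder conservative force
    x, v = 1.0, 0.0
    for _ in range(1000):
        x, v = step(x, v, spring)
    # for a conservative force, total energy should stay nearly constant (= 5.0)
    print(0.5 * v ** 2 + 5.0 * x ** 2)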

Relevance: 100.00%

Abstract:

Objective: To introduce a new technique for co-registration of magnetoencephalography (MEG) with magnetic resonance imaging (MRI). We compare the accuracy of a new bite-bar with fixed fiducials to a previous technique whereby fiducial coils were attached proximal to landmarks on the skull. Methods: A bite-bar with fixed fiducial coils is used to determine the position of the head in the MEG co-ordinate system. Co-registration is performed by a surface-matching technique. The advantage of fixing the coils is that the co-ordinate system is not based upon arbitrary, operator-dependent fiducial points attached to landmarks (e.g. the nasion and the preauricular points), but rather on points that are permanently fixed in relation to the skull. Results: As a consequence of minimizing coil movement during digitization, errors in localization of the coils are significantly reduced, as shown by a randomization test. Displacement of the bite-bar caused by removal and repositioning between MEG recordings is minimal (∼0.5 mm), and dipole localization in a somatosensory mapping paradigm shows a repeatability of ∼5 mm. The overall accuracy of the new procedure is greatly improved compared to the previous technique. Conclusions: The test-retest reliability and accuracy of target localization with the new design is superior to techniques that use anatomically based fiducial points or coils placed on the circumference of the head. © 2003 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
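The co-registration step ultimately amounts to finding a rigid transform between the MEG and MRI co-ordinate frames. As a sketch of that underlying computation only (a standard least-squares Kabsch/SVD alignment for matched points, not the authors' surface-matching algorithm):

    import numpy as np

    def rigid_align(P, Q):
        """Least-squares rotation R and translation t mapping each row p
        of P to R @ p + t ~ the corresponding row of Q; P, Q are (n, 3)."""
        Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
        U, _, Vt = np.linalg.svd(Pc.T @ Qc)
        d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = Q.mean(axis=0) - R @ P.mean(axis=0)
        return R, t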

Relevance: 100.00%

Abstract:

The mammalian binaural cue of interaural time difference (ITD) and cross-correlation have long been used to determine the point of origin of a sound source. The ITD can be defined as the difference between the points in time at which a sound from a single location arrives at each ear [1]. From this time difference, the brain can calculate the angle of the sound source in relation to the head [2]. Cross-correlation compares the similarity of the two channels of a binaural waveform, producing the time lag, or offset, required for both channels to be in phase with one another. This offset corresponds to the maximum value produced by the cross-correlation function and can be used to determine the ITD and thus the azimuthal angle θ of the original sound source. However, in indoor environments, cross-correlation is known to have problems with both sound reflections and reverberation. Additionally, cross-correlation has difficulty localising short-term complex noises that occur within a longer-duration waveform, i.e. in the presence of background noise: the cross-correlation algorithm processes the entire waveform, and the short-term complex noise can be ignored. This paper presents a thresholding technique that enables better localisation of short-term complex sounds in the midst of background noise. To evaluate the technique, twenty-five sounds, consisting of hand-claps, finger-clicks and speech, were recorded in a dynamic and echoic environment. The proposed technique was compared to the regular cross-correlation function on the same waveforms, averaging the azimuthal angles determined for each individual sample. Localisation accuracy over all twenty-five sound samples was 44% for regular cross-correlation and 84% for cross-correlation with thresholding. These results show that the proposed technique is very successful at localising short-term complex sounds in the midst of background noise in a dynamic and echoic indoor environment.
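A minimal sketch of the ITD-plus-threshold idea (the sample rate, microphone spacing and threshold rule below are assumptions, not the paper's parameters):

    import numpy as np

    FS = 44100       # sample rate [Hz] (assumed)
    MIC_DIST = 0.15  # distance between the two microphones [m] (assumed)
    C = 343.0        # speed of sound [m/s]

    def azimuth(left, right, rel_threshold=0.1):
        # keep only samples above a fraction of the peak amplitude, so a
        # short, loud event dominates the long background-noise portion
        peak = max(np.abs(left).max(), np.abs(right).max())
        mask = np.maximum(np.abs(left), np.abs(right)) > rel_threshold * peak
        l, r = left * mask, right * mask
        corr = np.correlate(l, r, mode="full")  # cross-correlation
        lag = np.argmax(corr) - (len(r) - 1)    # offset of best alignment
        itd = lag / FS                          # interaural time difference [s]
        s = np.clip(itd * C / MIC_DIST, -1.0, 1.0)
        return np.degrees(np.arcsin(s))         # azimuthal angle theta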

Relevance: 100.00%

Abstract:

We propose a crack propagation algorithm which is independent of particular constitutive laws and specific element technology. It consists of a localization limiter in the form of the screened Poisson equation with local mesh refinement. This combination allows the capturing of strain localization with good resolution, even in the absence of a sufficiently fine initial mesh. In addition, crack paths are implicitly defined from the localized region, circumventing the need for a specific direction criterion. Observed phenomena such as multiple crack growth and shielding emerge naturally from the algorithm. In contrast with alternative regularization algorithms, curved cracks are correctly represented. A staggered scheme for standard equilibrium and screened equations is used. Element subdivision is based on edge split operations using a given constitutive quantity (either damage or void fraction). To assess the robustness and accuracy of this algorithm, we use both quasi-brittle benchmarks and ductile tests.
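For context, a screened Poisson (implicit-gradient) localization limiter is commonly written as follows; the notation here is taken from the general regularization literature, not necessarily the paper's:

    \bar{e} - l^2 \nabla^2 \bar{e} = e \quad \text{in } \Omega, \qquad
    \nabla \bar{e} \cdot \mathbf{n} = 0 \quad \text{on } \partial\Omega

where e is the local constitutive quantity (damage or void fraction), \bar{e} its smoothed counterpart, and the length scale l sets the width of the localization band from which the crack path is extracted.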

Relevance: 100.00%

Abstract:

The recent transformations taking place in the city of Sao Paulo show how the global urban dimension goes hand in hand with local deprivation and segregation. Both globalisation trends and segregation dynamics orientate a social and spatial process of change whose main result is the dissolution of the urban condition in Sao Paulo. The metropolisation dynamics currently at work in South American cities can be understood, in light of evidence from the analysis of Sao Paulo, as a transition resulting from the globalisation process taking place in contemporary cities.

Relevance: 100.00%

Abstract:

Artificial neural networks have been used to analyze a number of engineering problems, including settlement caused by different tunneling methods in various types of ground mass. This paper focuses on settlement over shotcrete-supported tunnels on Sao Paulo subway line 2 (West Extension) that were excavated in Tertiary sediments using the sequential excavation method. The adjusted network is a good tool for predicting settlement above new tunnels to be excavated in similar conditions. The influence of network training parameters on the quality of results is also discussed. (C) 2007 Elsevier Ltd. All rights reserved.
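Purely as an illustration of the regression task described (the features, data and network size below are hypothetical, not those of the paper):

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Synthetic stand-in: map assumed excavation features (e.g. depth,
    # cover-to-diameter ratio, advance length) to surface settlement [mm].
    rng = np.random.default_rng(0)
    X = rng.uniform(size=(200, 3))
    y = 5.0 * X[:, 0] - 2.0 * X[:, 1] + rng.normal(scale=0.1, size=200)

    net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
    net.fit(X[:150], y[:150])
    print("held-out R^2:", net.score(X[150:], y[150:]))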

Relevance: 100.00%

Abstract:

Background: Dementia is now a major public health issue in low- and middle-income countries, and strategies for primary prevention are needed. This study aimed to estimate the proportion of cases of dementia attributable to illiteracy, non-skilled occupation and low income, which are common, potentially modifiable social adversities that occur along the lifespan in low- and middle-income countries. Methods: This report is based on data from the Sao Paulo Ageing & Health Study (SPAH; N = 2003). All individuals aged 65 years and older residing within pre-defined socially deprived areas of the city of Sao Paulo, Brazil, were included. The outcome of interest was prevalent dementia. Indicators of socioeconomic position (SEP) were literacy (distal indicator), highest occupational attainment (intermediate indicator), and monthly personal income (proximal indicator). We estimated the proportion of prevalent dementia attributable to each SEP indicator (illiteracy, non-skilled occupations and low income) by calculating their population attributable fractions (PAF). Results: Dementia was more prevalent amongst participants who were illiterate, had non-skilled occupations and had lower income. Illiteracy, poor occupational achievement and low income accounted for 22.0%, 38.5% and 38.5% of the cases of dementia, respectively. There was a cumulative effect of socioeconomic adversities over the lifespan, and nearly 50% of the prevalence of dementia could potentially be attributed to the combination of two or three of the socioeconomic adversities investigated. Conclusions: Public policies aimed at improving education, occupational skills and income could potentially have a role in primary prevention of dementia. Governments should address this issue in a purposeful and systematic way.
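For reference, the population attributable fraction for a single exposure is typically computed with Levin's formula (the study's exact estimator may differ, e.g. by adjusting for covariates):

    \mathrm{PAF} = \frac{p_e\,(\mathrm{RR} - 1)}{1 + p_e\,(\mathrm{RR} - 1)}

where p_e is the prevalence of the exposure and RR its relative risk. As a purely hypothetical check of the arithmetic, p_e = 0.3 and RR = 2 give PAF = 0.3 / 1.3 ≈ 23%.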

Relevance: 100.00%

Abstract:

Eukaryotic phenotypic diversity arises from multitasking of a core proteome of limited size. Multitasking is routine in computers, as well as in other sophisticated information systems, and requires multiple inputs and outputs to control and integrate network activity. Higher eukaryotes have a mosaic gene structure with a dual output: mRNA (protein-coding) sequences and introns, which are released from the pre-mRNA by posttranscriptional processing. Introns have been enormously successful as a class of sequences and comprise up to 95% of the primary transcripts of protein-coding genes in mammals. In addition, many other transcripts (perhaps more than half) do not encode proteins at all, but appear both to be developmentally regulated and to have genetic function. We suggest that these RNAs (eRNAs) have evolved to function as endogenous network control molecules which enable direct gene-gene communication and multitasking of eukaryotic genomes. Analysis of a range of complex genetic phenomena in which RNA is involved or implicated, including co-suppression, transgene silencing, RNA interference, imprinting, methylation, and transvection, suggests that a higher-order regulatory system based on RNA signals operates in the higher eukaryotes and involves chromatin remodeling as well as other RNA-DNA, RNA-RNA, and RNA-protein interactions. The evolution of densely connected gene networks would be expected to result in a relatively stable core proteome due to the multiple reuse of components, implying that cellular differentiation and phenotypic variation in the higher eukaryotes result primarily from variation in the control architecture. Thus, network integration and multitasking using trans-acting RNA molecules produced in parallel with protein-coding sequences may underpin both the evolution of developmentally sophisticated multicellular organisms and the rapid expansion of phenotypic complexity into uncontested environments such as those initiated in the Cambrian radiation and those seen after major extinction events.

Relevance: 100.00%

Abstract:

There have been few replicated examples of genotype x environment interaction effects on behavioral variation or risk of psychiatric disorder. We review some of the factors that have made detection of genotype x environment interaction effects difficult, and show how genotype x shared environment interaction (GxSE) effects are commonly confounded with genetic parameters in data from twin pairs reared together. Historic data on twin pairs reared apart can in principle be used to estimate such GxSE effects, but have rarely been used for this purpose. We illustrate this using previously published data from the Swedish Adoption/Twin Study of Aging (SATSA), which suggest that GxSE effects could account for as much as 25% of the total variance in risk of becoming a regular smoker. Since few separated twin pairs will be available for study in the future, we also consider methods for modifying variance components linkage analysis to allow for environmental interactions with linked loci.
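To see why the confounding arises, a simplified illustration (a standard result in the twin-modelling literature, e.g. Purcell, 2002; standardized, mutually independent A, C and E are simplifying assumptions): for a phenotype with a multiplicative gene-by-shared-environment term,

    P_i = a A_i + c\,C + e E_i + \gamma\,(A_i \times C),
    \qquad \mathrm{Cov}_{\mathrm{MZT}} = a^2 + c^2 + \gamma^2,
    \qquad \mathrm{Cov}_{\mathrm{DZT}} = \tfrac{1}{2} a^2 + c^2 + \tfrac{1}{2} \gamma^2

The interaction variance \gamma^2 enters with the same MZ:DZ pattern as a^2, so for pairs reared together it is absorbed into the additive genetic estimate; for pairs reared apart, C is not shared, the \gamma term drops out of the pair covariance, and the confound is broken.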