901 results for Degree of freedom
Abstract:
Based on theoretical arguments, we propose a possible route for controlling the band-gap in the promising photovoltaic material CdIn2S4. Our ab initio calculations show that the experimental degree of inversion in this spinel (the fraction of tetrahedral sites occupied by In) corresponds approximately to the equilibrium value given by the minimum of the theoretical inversion free energy at a typical synthesis temperature. Modification of this temperature, or of the cooling rate after synthesis, is then expected to change the inversion degree, which in turn sensitively tunes the electronic band-gap of the solid, as shown here by Heyd-Scuseria-Ernzerhof screened hybrid functional calculations.
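The equilibrium argument lends itself to a small numerical illustration. The sketch below is purely hypothetical: the quadratic enthalpy model and its coefficients are illustrative stand-ins, not the paper's ab initio data. It finds the inversion degree x that minimises a model free energy F(x, T) = E(x) - T*S(x) at several temperatures:

```python
# Illustrative sketch: equilibrium inversion degree x as the minimum of a
# model free energy F(x, T) = E(x) - T*S(x). The quadratic enthalpy
# E(x) = a*x + b*x**2 (O'Neill-Navrotsky form) and its coefficients are
# assumed values; the paper derives the energetics from ab initio data.
import numpy as np
from scipy.optimize import minimize_scalar

kB = 8.617e-5  # Boltzmann constant, eV/K

def config_entropy(x):
    """Ideal configurational entropy (in units of kB) per AB2X4 formula
    unit for inversion degree x: one tetrahedral and two octahedral sites."""
    def xlnx(p):
        return p * np.log(p) if p > 0 else 0.0
    return -(xlnx(x) + xlnx(1 - x) + 2 * (xlnx(x / 2) + xlnx(1 - x / 2)))

def free_energy(x, T, a=0.10, b=-0.04):  # a, b in eV; assumed parameters
    return a * x + b * x**2 - T * kB * config_entropy(x)

for T in (300, 800, 1100):  # 800-1100 K as a plausible synthesis range
    res = minimize_scalar(free_energy, bounds=(1e-6, 1 - 1e-6),
                          args=(T,), method='bounded')
    print(f"T = {T:5d} K  ->  equilibrium inversion x = {res.x:.3f}")
```

Raising T deepens the entropy term and shifts the minimum to larger x, which is the mechanism the abstract invokes for tuning the band-gap through synthesis temperature.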
Abstract:
We first propose a simple task for eliciting attitudes toward risky choice, the SGG lottery-panel task, which consists of a series of lotteries constructed to compensate riskier options with higher risk-return trade-offs. Using principal component analysis, we show that the SGG lottery-panel task is capable of capturing two dimensions of individual risky decision making, i.e., subjects' average risk taking and their sensitivity to variations in risk-return. From the results of a large experimental dataset, we confirm that the task systematically captures a number of regularities, such as: a tendency toward risk-averse behavior (only around 10% of choices are compatible with risk neutrality); an attraction to certain payoffs compared with low-risk lotteries, compatible with the over- (under-)weighting of small (large) probabilities predicted by prospect theory (PT); and gender differences, i.e., males being consistently less risk averse than females, though both genders are similarly responsive to increases in the risk premium. Another interesting result is that in hypothetical choices most individuals increase their risk taking in response to increases in the return to risk, as predicted by PT, while across panels with real rewards we see even more changes, but opposite to the expected pattern of riskier choices for higher risk-returns. We therefore conclude from our data that an "economic anomaly" emerges in the real-reward choices, opposite to the hypothetical choices. These findings are in line with Camerer's (1995) view that although in many domains paid subjects probably do exert extra mental effort, which improves their performance, choice over money gambles is not likely to be a domain in which effort will improve adherence to rational axioms (p. 635). Finally, we demonstrate that both dimensions of risk attitudes, average risk taking and sensitivity to variations in the return to risk, are useful not only for describing behavior under risk but also for explaining behavior in other contexts, as illustrated by an example. In the second study, we propose three additional treatments intended to elicit risk attitudes under high-stake and mixed-outcome (gains and losses) lotteries. Using a dataset obtained from a hypothetical implementation of the tasks, we show that the new treatments are able to capture both dimensions of risk attitudes. This new dataset allows us to describe several regularities, at both the aggregate and the within-subjects level. We find that in every treatment over 70% of choices show some degree of risk aversion, and only between 0.6% and 15.3% of individuals are consistently risk neutral within the same treatment. We also confirm the existence of gender differences in the degree of risk taking: in all treatments, females prefer safer lotteries than males do. Regarding our second dimension of risk attitudes, we observe in all treatments an increase in risk taking in response to risk-premium increases. Treatment comparisons reveal other regularities, such as a lower degree of risk taking in large-stake treatments than in low-stake treatments, and a lower degree of risk taking when losses are incorporated into the large-stake lotteries; these results are compatible with previous findings in the literature on stake-size effects (e.g., Binswanger, 1980; Bosch-Domènech & Silvestre, 1999; Hogarth & Einhorn, 1990; Holt & Laury, 2002; Kachelmeier & Shehata, 1992; Kühberger et al., 1999; Weber & Chapman, 2005; Wik et al., 2007) and the domain effect (e.g., Brooks & Zank, 2005; Schoemaker, 1990; Wik et al., 2007). For small-stake treatments, by contrast, we find that the effect of incorporating losses into the outcomes is less clear: at the aggregate level an increase in risk taking is observed, but also more dispersion in the choices, while at the within-subjects level the effect weakens. Finally, regarding responses to the risk premium, we find that sensitivity is lower in the mixed-lottery treatments (SL and LL) than in the gains-only treatments. In general, sensitivity to risk-return is more affected by the domain than by the stake size. Having described the properties of risk attitudes as captured by the SGG risk-elicitation task and its three new versions, it is important to recall that the danger of using unidimensional descriptions of risk attitudes goes beyond their incompatibility with modern economic theories such as PT and CPT, all of which call for tests with multiple degrees of freedom. Faithful to this recommendation, the contribution of this essay is an empirically and endogenously determined bi-dimensional specification of risk attitudes, useful for describing behavior under uncertainty and for explaining behavior in other contexts. Hopefully, this will contribute to the creation of large datasets containing a multidimensional description of individual risk attitudes, while remaining compatible with present and even more complex future descriptions of human attitudes toward risk.
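A minimal sketch of the two-dimensional extraction described above: PCA on a subjects-by-panels matrix of lottery choices. The data, matrix dimensions, and the loading-based interpretation are placeholders, not the SGG dataset or the authors' exact procedure.

```python
# Sketch: extract two components from a (subjects x panels) choice matrix.
# Each entry would, in the real task, encode the riskiness of the option
# chosen in one SGG lottery panel; here it is a random placeholder.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
choices = rng.integers(1, 5, size=(100, 4)).astype(float)  # 100 subjects, 4 panels

pca = PCA(n_components=2)
scores = pca.fit_transform(choices)  # per-subject scores on the two dimensions

# Interpretation assumed for illustration: a component with same-sign
# loadings on all panels ~ average risk taking; a component with opposing
# signs across panels ~ sensitivity to the risk-return trade-off.
print(pca.components_)
print(pca.explained_variance_ratio_)
```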
Abstract:
We study the approximation of harmonic functions by means of harmonic polynomials in two-dimensional, bounded, star-shaped domains. Assuming that the functions possess analytic extensions to a δ-neighbourhood of the domain, we prove exponential convergence of the approximation error with respect to the degree of the approximating harmonic polynomial. All the constants appearing in the bounds are explicit and depend only on the shape-regularity of the domain and on δ. We apply the obtained estimates to show exponential convergence with rate O(exp(−b√N)), where N is the number of degrees of freedom and b > 0, of an hp-dGFEM discretisation of the Laplace equation based on piecewise harmonic polynomials. This result is an improvement over the classical rate O(exp(−b∛N)) and is due to the use of harmonic polynomial spaces, as opposed to complete polynomial spaces.
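The improvement can be seen from a degree-of-freedom count (a sketch; the scalings below are the standard hp dimension-counting argument, stated here as an assumption rather than quoted from the paper). With the error decaying exponentially in the polynomial degree p, the rate in N follows from how many degrees of freedom a degree-p space needs:

```latex
\|u - u_{hp}\| \le C\, e^{-b p}, \qquad
N = \begin{cases}
  O(p^2), & \text{harmonic (Trefftz) basis: } O(p) \text{ functions per element},\\
  O(p^3), & \text{complete polynomial basis: } O(p^2) \text{ functions per element},
\end{cases}
\;\Longrightarrow\;
\|u - u_{hp}\| \le \begin{cases}
  C\, e^{-b' \sqrt{N}},\\[2pt]
  C\, e^{-b' \sqrt[3]{N}}.
\end{cases}
```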
Social equality in the number of choice options is represented in the ventromedial prefrontal cortex
Abstract:
A distinct aspect of the sense of fairness in humans is that we care not only about equality in material rewards but also about equality in non-material values. One such value is the opportunity to choose freely among many options, often regarded as a fundamental right to economic freedom. In modern developed societies, equal opportunities in work, living, and lifestyle are enforced by anti-discrimination laws. Despite the widespread endorsement of equal opportunity, no studies have explored how people assign value to it. We used functional magnetic resonance imaging to identify the neural substrates for subjective valuation of equality in choice opportunity. Participants performed a two-person choice task in which the number of choices available was varied across trials independently of choice outcomes. By using this procedure, we manipulated the degree of equality in choice opportunity between players and dissociated it from the value of reward outcomes and their equality. We found that activation in the ventromedial prefrontal cortex (vmPFC) tracked the degree to which the number of options between the two players was equal. In contrast, activation in the ventral striatum tracked the number of options available to participants themselves but not the equality between players. Our results demonstrate that the vmPFC, a key brain region previously implicated in the processing of social values, is also involved in valuation of equality in choice opportunity between individuals. These findings may provide valuable insight into the human ability to value equal opportunity, a characteristic long emphasized in politics, economics, and philosophy.
Abstract:
4-Dimensional Variational Data Assimilation (4DVAR) assimilates observations through the minimisation of a least-squares objective function, which is constrained by the model flow. We refer to 4DVAR as strong-constraint 4DVAR (sc4DVAR) in this thesis, as it assumes the model is perfect. Relaxing this assumption gives rise to weak-constraint 4DVAR (wc4DVAR), leading to a different minimisation problem with more degrees of freedom. We consider two wc4DVAR formulations in this thesis: the model-error formulation and the state-estimation formulation. The 4DVAR objective function is traditionally solved using gradient-based iterative methods. The principal method used in Numerical Weather Prediction today is the Gauss-Newton approach. This method introduces a linearised 'inner-loop' objective function which, upon convergence, updates the solution of the non-linear 'outer-loop' objective function. This requires many evaluations of the objective function and its gradient, which emphasises the importance of the Hessian. The eigenvalues and eigenvectors of the Hessian provide insight into the degree of convexity of the objective function, while also indicating the difficulty one may encounter while iteratively solving 4DVAR. The condition number of the Hessian is an appropriate measure of the sensitivity of the problem to input data; it can also indicate the rate of convergence and the solution accuracy of the minimisation algorithm. This thesis investigates the sensitivity of the solution process for both wc4DVAR objective functions to the internal assimilation parameters that compose the problem. We gain insight into these sensitivities by bounding the condition numbers of the Hessians of both objective functions. We also precondition the model-error objective function and show improved convergence. Using the bounds, we show that the sensitivities of both formulations are related to the error-variance balance, the assimilation window length and the correlation length-scales. We further demonstrate this through numerical experiments on the condition number and data assimilation experiments using linear and non-linear chaotic toy models.
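To make the Hessian's role concrete, the toy sketch below assembles the strong-constraint Hessian for a linear model with direct observations and reports its condition number. All parameter choices (model, covariances, window length) are illustrative; none are taken from the thesis.

```python
# Toy sc4DVAR Hessian: for a linear model x_{k+1} = M x_k, identity
# observation operator, background covariance B and observation covariance R,
#   S = B^{-1} + sum_k (M^k)^T R^{-1} M^k.
# Its condition number is the sensitivity measure discussed above.
import numpy as np

n, nsteps = 10, 5
M = np.diag(np.full(n, 0.95)) + np.diag(np.full(n - 1, 0.1), k=1)  # toy dynamics
B = 0.5 * np.exp(-np.abs(np.subtract.outer(range(n), range(n))) / 2.0)  # correlated background errors
R = 0.1 * np.eye(n)  # uncorrelated observation errors

S = np.linalg.inv(B)
Mk = np.eye(n)
for _ in range(nsteps):
    S += Mk.T @ np.linalg.inv(R) @ Mk
    Mk = M @ Mk  # advance the tangent-linear propagator

print("condition number of the Hessian:", np.linalg.cond(S))
```

Varying the error variances, the correlation length-scale in B, or nsteps in this sketch reproduces, qualitatively, the dependencies the thesis bounds analytically.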
Abstract:
By virtue of the volume and nature of their attributions, which include secondary schooling as well as problem areas such as security and traffic, the Brazilian states bear the ultimate responsibility for young people. This study argues in favour of granting the states greater freedom to define their own public-policy parameters, so as to deal with local features and to increase the degree of learning about such actions at the national level. In empirical terms, the study assesses the impacts of new laws, such as the new traffic code (from joint work with Leandro Kume, a doctoral student at EPGE/FGV), and traces the statistics for specific questions such as drugs, violence and car accidents. The findings show that these questions produce different results for young men and women. The main characters in these dramas are young single males, suggesting the need for public policies differentiated not only by age but also by gender. The study also reveals that the magnitude of these problems changes according to the youth's social class. Imprisonment mainly concerns poorer men (except for the functionally illiterate), while fatal car accidents and self-reported drug use mainly concern upper-class young men.
Abstract:
Introduction: The aim of this study was to assess the influence of curing time and power on the degree of conversion and surface microhardness of 3 orthodontic composites. Methods: One hundred eighty discs, 6 mm in diameter, were divided into 3 groups of 60 samples according to the composite used (Transbond XT, 3M Unitek, Monrovia, Calif; Opal Bond MV, Ultradent, South Jordan, Utah; and Transbond Plus Color Change, 3M Unitek), and each group was further divided into 3 subgroups (n = 20). Five samples were used to measure conversion, and 15 were used to measure microhardness. A light-emitting diode curing unit with broad multiwavelength emission was used for curing at 3 power levels (530, 760, and 1520 mW) and 3 times (8.5, 6, and 3 seconds), always totaling 4.56 joules. Five specimens from each subgroup were ground and mixed with potassium bromide to produce 8-mm tablets, which were compared with 5 others made similarly with the respective noncured composite. These were placed into a spectrometer, and software was used for analysis. A microhardness tester was used to take Knoop hardness (KHN) measurements in 15 discs of each subgroup. The data were analyzed with 2 analysis of variance tests at 2 levels. Results: Differences were found in the conversion degree of the composites cured at different times and powers (P < 0.01). The composites showed similar degrees of conversion when light cured for 8.5 seconds (80.7%) and 6 seconds (79.0%), but not for 3 seconds (75.0%). The conversion degrees of the composites differed, with group 3 (87.2%) higher than group 2 (83.5%), which was higher than group 1 (64.0%). Differences in microhardness were also found (P < 0.01), with lower microhardness at 8.5 seconds (35.2 KHN) but no difference between 6 seconds (41.6 KHN) and 3 seconds (42.8 KHN). Group 3 had the highest surface microhardness (35.9 KHN) compared with group 2 (33.7 KHN) and group 1 (30.0 KHN). Conclusions: Curing time can be reduced to 6 seconds by increasing the power; at 3 seconds there is a slight decrease in the degree of conversion, and the reduced time has a positive effect on surface microhardness.
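Since the three power/time pairings are designed to deliver the same radiant energy, the arithmetic is easy to verify; a quick illustrative check:

```python
# Radiant energy = power x time for each power/time pairing used in the study.
for power_mw, time_s in [(530, 8.5), (760, 6.0), (1520, 3.0)]:
    energy_j = power_mw / 1000 * time_s
    print(f"{power_mw} mW x {time_s} s = {energy_j:.2f} J")
# Prints 4.51 J, 4.56 J, 4.56 J: the 530 mW / 8.5 s pairing delivers about
# 4.51 J, slightly below the nominal 4.56 J total stated for all subgroups.
```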
Abstract:
This work focused on the synthesis of novel monomers for the design of a series of oligo(p-benzamide)s following two approaches: iterative solution synthesis and automated solid-phase protocols. These approaches offer a useful route to the sequence-controlled synthesis of side-chain- and main-chain-functionalized oligomers for the preparation of an immense variety of nanoscaffolds. The challenge in the synthesis of such materials was their modification while maintaining their characteristic properties (physical-chemical properties, shape persistence and anisotropy). The strategy for the preparation of predictable superstructures was devoted to the selective control of noncovalent interactions, monodispersity and monomer sequence. In addition, the structure-property correlation of the prepared rod-like soluble materials was examined. The first approach involved solution-based aramide synthesis via the introduction of the 2,4-dimethoxybenzyl N-amide protecting group within an iterative synthetic strategy. The second approach focused on the implementation of the salicylic acid scaffold to introduce substituents on the aromatic backbone for the stabilization of the OPBA rotamers. The prepared oligomers were analyzed with regard to their solubility and aggregation properties by systematically changing the degree of rotational freedom of the amide bonds, the side-chain polarity, the monomer sequence and the degree of oligomerization. The syntheses were performed on a modified commercial peptide synthesizer using a combination of fluorenylmethoxycarbonyl (Fmoc) and aramide chemistry. The automated synthesis allowed the preparation of aramides with potential applications as nanoscaffolds in supramolecular chemistry, e.g. comb-like-
Abstract:
Organic charge-transfer systems exhibit a variety of competing interactions between charge, spin and lattice degrees of freedom. This gives rise to interesting physical properties such as metallic conductivity, superconductivity and magnetism. This dissertation deals with the electronic structure of organic charge-transfer salts from three material families, using a range of photoemission and X-ray spectroscopy techniques. Some of the investigated molecules were synthesized at the MPI for Polymer Research. They come from the coronene family (the donor hexamethoxycoronene, HMC, and the acceptor coronene-hexaone, COHON) and the pyrene family (the donors tetra- and hexamethoxypyrene, TMP and HMP) in complexes with the classic strong acceptor tetracyanoquinodimethane (TCNQ). As a third family, charge-transfer salts of the κ-(BEDT-TTF)2X family (X being a monovalent anion) were investigated; these materials lie close to a bandwidth-controlled Mott transition in the phase diagram.

For investigations by ultraviolet photoelectron spectroscopy (UPS), thin films were prepared by UHV deposition, using a new double evaporator developed specifically for milligram quantities of material. This method revealed energetic shifts of valence states in the charge-transfer complex of the order of a few 100 meV relative to the pure donor and acceptor species. An important aspect of the UPS measurements was the direct comparison with ab initio calculations.

The problem of unavoidable surface contamination of solution-grown 3D crystals was overcome by hard X-ray photoelectron spectroscopy (HAXPES) at photon energies around 6 keV (at the PETRA III electron storage ring in Hamburg). The large mean free path of the photoelectrons, of the order of 15 nm, results in true bulk sensitivity. The first HAXPES experiments on charge-transfer complexes worldwide showed large chemical shifts (several eV). In the compound HMPx-TCNQy, the N 1s line is a fingerprint of the cyano group in TCNQ and shows a splitting and a shift to higher binding energies of up to 6 eV with increasing HMP content. Conversely, the O 1s line is a fingerprint of the methoxy group in HMP and shows a marked splitting and a shift to lower binding energies (a chemical shift of up to about 2.5 eV), i.e. an order of magnitude larger than the shifts in the valence region.

As a further synchrotron-radiation-based technique, near-edge X-ray absorption fine structure (NEXAFS) spectroscopy was used extensively at the ANKA storage ring in Karlsruhe. The mean free path of the low-energy secondary electrons is around 5 nm. Strong intensity variations of certain pre-edge resonances (a signature of the unoccupied density of states) directly reflect changes in the occupation numbers of the orbitals involved in the immediate vicinity of the excited atom. This made it possible to identify precisely which orbitals participate in the charge-transfer mechanism. In the complex mentioned above, charge is transferred from the methoxy orbitals 2e(π*) and 6a1(σ*) to the cyano orbitals b3g and au(π*) and, to a lesser extent, to the b1g and b2u(σ*) orbitals of the cyano group. In addition, small energetic shifts of opposite sign occur for the donor and acceptor resonances, comparable to the shifts observed in UPS.
Abstract:
This dissertation studies the geometric static problem of under-constrained cable-driven parallel robots (CDPRs) supported by n cables, with n ≤ 6. The task consists of determining the overall robot configuration when a set of n variables is assigned. When variables relating to the platform posture are assigned, an inverse geometric static problem (IGP) must be solved; whereas, when cable lengths are given, a direct geometric static problem (DGP) must be considered. Both problems are challenging, as the robot continues to preserve some degrees of freedom even after n variables are assigned, with the final configuration determined by the applied forces. Hence, kinematics and statics are coupled and must be resolved simultaneously. In this dissertation, a general methodology is presented for modelling the aforementioned scenario with a set of algebraic equations. An elimination procedure is provided, aimed at solving the governing equations analytically and obtaining a least-degree univariate polynomial in the corresponding ideal for any value of n. Although an analytical procedure based on elimination is important from a mathematical point of view, providing an upper bound on the number of solutions in the complex field, it is not practical to compute these solutions as it would be very time-consuming. Thus, for the efficient computation of the solution set, a numerical procedure based on homotopy continuation is implemented. A continuation algorithm is also applied to find a set of robot parameters with the maximum number of real assembly modes for a given DGP. Finally, the end-effector pose depends on the applied load and may change due to external disturbances. An investigation into equilibrium stability is therefore performed.
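To make the coupling of kinematics and statics concrete, the sketch below solves a toy DGP for a point-mass end-effector on two cables by minimising gravitational potential energy subject to cable-length constraints. This energy-based route, and all the geometry in it, are made-up illustrations; the dissertation's algebraic elimination and homotopy-continuation machinery handles the general rigid-platform case.

```python
# Toy direct geometric-static problem: given anchor points and assigned
# cable lengths, the equilibrium position of a point-mass platform minimises
# its height subject to the cables not being overstretched (slack allowed).
import numpy as np
from scipy.optimize import minimize

anchors = np.array([[0.0, 0.0, 2.0], [1.5, 0.0, 2.0]])  # cable exit points (illustrative)
lengths = np.array([1.4, 1.2])                           # assigned cable lengths

def height(x):
    return x[2]  # potential energy ~ z-coordinate of the platform

cons = [{'type': 'ineq',  # L_i - |x - a_i| >= 0  (cable i not overstretched)
         'fun': lambda x, a=a, L=L: L - np.linalg.norm(x - a)}
        for a, L in zip(anchors, lengths)]

res = minimize(height, x0=np.array([0.7, 0.1, 1.0]), constraints=cons)
print("equilibrium position:", res.x)
print("taut cables:", [np.isclose(np.linalg.norm(res.x - a), L, atol=1e-3)
                       for a, L in zip(anchors, lengths)])
```

Even in this toy, the pose is fixed jointly by the assigned lengths and the applied load (here, gravity), which is the kinematics/statics coupling the dissertation resolves in full generality.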
Abstract:
BACKGROUND: In order to optimise the cost-effectiveness of active surveillance to substantiate freedom from disease, a new approach using targeted sampling of farms was developed and applied to the example of infectious bovine rhinotracheitis (IBR) and enzootic bovine leucosis (EBL) in Switzerland. Relevant risk factors (RF) for the introduction of IBR and EBL into Swiss cattle farms were identified and their relative risks defined based on a literature review and expert opinion. A quantitative model based on the scenario tree method was subsequently used to calculate the required sample size of a targeted sampling approach (TS) for a given sensitivity. We compared this sample size with that of a stratified random sample (sRS) with regard to efficiency. RESULTS: The required sample sizes to substantiate disease freedom were 1,241 farms for IBR and 1,750 farms for EBL to detect 0.2% herd prevalence with 99% sensitivity. Using a conventional sRS, the required sample sizes were 2,259 farms for IBR and 2,243 for EBL. Considering the additional administrative expenses required for the planning of TS, the risk-based approach was still more cost-effective than an sRS (a 40% reduction in full survey costs for IBR and 8% for EBL) due to the considerable reduction in sample size. CONCLUSIONS: As the model depends on RF selected through literature review and was parameterised with values estimated by experts, it is subject to some degree of uncertainty. Nevertheless, this approach provides the veterinary authorities with a promising tool for future cost-effective sampling designs.
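For context, the quoted sRS figures are close to what the standard freedom-from-disease approximation gives. A back-of-envelope check (a simplification assuming a perfect test, unlike the paper's scenario-tree model, so exact agreement is not expected):

```python
# Smallest n with 1 - (1 - p)^n >= Se, for design prevalence p and target
# surveillance sensitivity Se.
import math

p, Se = 0.002, 0.99
n = math.ceil(math.log(1 - Se) / math.log(1 - p))
print(n)  # 2301 -- close to the reported sRS sizes of 2,259 (IBR) and 2,243 (EBL)
```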
Abstract:
The purpose of this investigation was to develop a reliable scale to measure the social environment of hospital nursing units according to the degree of humanistic and dehumanistic behaviors as perceived by nursing staff in hospitals. The study was based on a conceptual model proposed by Jan Howard, a sociologist. After reviewing the literature relevant to personalization of care, analyzing interviews with patients in various settings, and studying biological, psychological, and sociological frames of reference, Howard proposed the following necessary conditions for humanized health care: the dimensions of Irreplaceability, Holistic Selves, Freedom of Action, Status Equality, Shared Decision Making and Responsibility, Empathy, and Positive Affect.

It was proposed that a scale composed of behaviors reflecting Howard's dimensions be developed within the framework of the social environment of nursing care units in hospitals. Nursing units were chosen because hospitals are traditionally organized around nursing care units and because patients spend the majority of their time in hospitals interacting with various levels of nursing personnel.

Approximately 180 items describing both patient and nursing staff behaviors that occur on nursing units were developed, including behaviors believed to be humanistic as well as dehumanistic. The items were classified under the dimensions of Howard's model by a purposively selected sample of 42 nurses representing a broad range of education, experience, and clinical areas. Items with a high degree of agreement, at least 50%, were placed in the questionnaire. The questionnaire consisted of 169 items, including six items from the Marlowe-Crowne Social Desirability Scale (Short Form).

The questionnaire, the Social Environment Scale, was distributed to the entire 7-to-3-shift nursing staff (603 people) of four hospitals: a public county specialty hospital, a public county general and acute hospital, a large university-affiliated hospital with all services, and a small general community hospital. Staff were asked to report on a Likert-type scale how often the listed behaviors occurred on their units. Three hundred sixteen respondents (52% of the population) participated in the study.

An item analysis was done in which each item was examined in relation to its correlation with its own dimension total and with the totals of the other dimensions. As a result of this analysis, three dimensions, Positive Affect, Irreplaceability, and Freedom of Action, were deleted from the scale. The final scale consisted of 70 items: 26 in Shared Decision Making and Responsibility, 25 in Holistic Selves, 12 in Status Equality, and seven in Empathy. The alpha coefficient was over .800 for all scales except Empathy, which was .597.

An analysis of variance by hospital was performed on the means of each dimension of the scale. There was a statistically significant difference between hospitals, with a trend for the public hospitals to score lower on the scale than the university or community hospitals. That the scale scores should be lower in crowded, understaffed public hospitals was not unexpected and indicated that the scale had some discriminating ability. These differences were still observed after adjusting for the effect of social desirability.

In summary, there is preliminary evidence based on this exploratory investigation that a reliable scale based on at least four dimensions of Howard's model can be developed to measure the concept of humanistic health care in hospital settings.
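For illustration, the internal-consistency statistic reported above (Cronbach's alpha) can be computed from a respondents-by-items matrix as follows. The response data here are random placeholders, not the study's ratings.

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances) / total variance).
import numpy as np

def cronbach_alpha(X):
    """X: (n_respondents, k_items) matrix of item scores."""
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
ratings = rng.integers(1, 6, size=(316, 26)).astype(float)  # e.g. the 26-item dimension
print(round(cronbach_alpha(ratings), 3))  # random data give alpha near 0;
# correlated real responses would push it toward the reported .800+
```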
Abstract:
The European Union has been promoting linguistic diversity for many years as one of its main educational goals. Such diversity facilitates student mobility and student exchanges between different universities and countries and enriches the education of young undergraduates. In particular, a higher degree of competence in the English language is becoming essential for engineers, architects and researchers in general, as English has become the lingua franca that opens up horizons to internationalisation and the transfer of knowledge in today's world. Many experts point to the Integrated Approach to Contents and Foreign Languages system as an option that has certain benefits over the traditional method of teaching a second language exclusively through language-specific subjects. This system advocates teaching the different subjects in the syllabus in a language other than one's mother tongue, without prioritising knowledge of the language over the subject. This was the idea that, in the 2009/10 academic year, gave rise to the Second Language Integration Programme (SLI Programme) at the Escuela Arquitectura Técnica of the Universidad Politécnica de Madrid (EUATM-UPM), just as tuition began for the new Building Engineering Degree, which had been adapted to the European Higher Education Area (EHEA) model. This programme is an interdisciplinary initiative covering the set of subjects taught during the semester and is coordinated through the Assistant Director's Office for Educational Innovation. The SLI Programme has a dual goal: to familiarise students with the specific English terminology of the subject being taught, and at the same time to improve their communication skills in English. A total of thirty lecturers are taking part in the teaching of eleven first-year subjects and twelve second-year subjects, with around 120 students voluntarily enrolled in a special group each semester. During the 2010/2011 academic year, the degree of acceptance and the results of the SLI Programme have been monitored. Tools have been designed to aid interdisciplinary coordination and to analyse satisfaction, such as coordination records and surveys. The results currently available refer to the first and second years and are divided into specific aspects of the different subjects involved and general aspects of the ongoing experience.