984 results for Prescribed mean-curvature problem
Abstract:
The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding recent developments on the algebraic structure of the simplex, and more than twenty years after Aitchison's idea of log-transformations of closed data, the scientific literature is again full of statistical treatments of this type of data using traditional methodologies. This is particularly true in environmental geochemistry where, besides the problem of closure, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions allows us to point out the statistical laws able to generate the values and to govern their variability. The changes, when compared, for example, with the mean values of the random variables assumed as models, or with other reference parameters, allow us to define monitors to be used to assess the extent of possible environmental contamination. A case study on running and ground waters from the Chiavenna Valley (Northern Italy), using Na+, K+, Ca2+, Mg2+, HCO3-, SO42- and Cl- concentrations, is illustrated.
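As a rough, hypothetical illustration of how log-contrast values can be obtained from a principal component analysis of compositional data: the sketch below uses a centred log-ratio (clr) transform and simulated data; it is not the authors' code, and the clr-based route is an assumption.

# Minimal sketch (not the authors' code): log-contrast scores via a clr
# transform followed by PCA. `X` is a hypothetical (n_samples x 7) array of
# Na, K, Ca, Mg, HCO3, SO4 and Cl concentrations, assumed strictly positive.
import numpy as np

def clr(X):
    """Centred log-ratio transform: log of each part minus the row geometric mean."""
    logX = np.log(X)
    return logX - logX.mean(axis=1, keepdims=True)

def logcontrast_scores(X, n_components=2):
    """PCA on clr-transformed data; each principal axis defines a log-contrast."""
    Z = clr(X)
    Zc = Z - Z.mean(axis=0)                  # centre columns
    U, s, Vt = np.linalg.svd(Zc, full_matrices=False)
    loadings = Vt[:n_components]             # log-contrast coefficients (each row sums to 0)
    scores = Zc @ loadings.T                 # log-contrast values per sample
    return scores, loadings

# usage with simulated positive data
rng = np.random.default_rng(0)
X = rng.lognormal(mean=1.0, sigma=0.3, size=(50, 7))
scores, loadings = logcontrast_scores(X)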
Abstract:
Some people cannot buy products without first touching them, believing that doing so provides more assurance and information and reduces uncertainty. The international consumer marketing literature suggests an instrument to measure consumers' need for physical contact, called Need for Touch (NFT). This paper analyzes whether the Need for Touch structure is empirically consistent. Based on a literature review, we propose six hypotheses in order to assess the nomological, convergent, and discriminant validity of the phenomenon. Of these, the data supported four in the predicted direction. Need for Touch was associated with Need for Input and with Need for Cognition. Need for Touch was not associated with traditional marketing channels. The results also showed the dual characterization of Need for Touch as a bi-dimensional construct. The moderator effect indicated that when the consumer has a higher (vs. lower) Need for Touch autotelic score, the experiential motivation for shopping played a more (vs. less) important role in impulsive motivation. Our Study 3 supports the NFT structure and shows new associations with the need for unique products and dependent decisions.
Abstract:
Hydrogeological research usually includes statistical studies devised to elucidate the mean background state, characterise relationships among different hydrochemical parameters, and show the influence of human activities. These goals are achieved either by means of a statistical approach or by mixing models between end-members. Compositional data analysis has proved to be effective with the first approach, but there is no commonly accepted solution to the end-member problem in a compositional framework. We present here a possible solution based on factor analysis of compositions, illustrated with a case study. We find two factors on the compositional bi-plot by fitting two non-centered orthogonal axes to the most representative variables. Each of these axes defines a subcomposition, grouping those variables that lie nearest to it. With each subcomposition a log-contrast is computed and rewritten as an equilibrium equation. These two factors can be interpreted as the isometric log-ratio (ilr) coordinates of three hidden components, which can be plotted in a ternary diagram. These hidden components might be interpreted as end-members.
We have analysed 14 molarities at 31 sampling stations along the Llobregat River and its tributaries, measured monthly over two years. We have obtained a bi-plot with 57% of the total variance explained, from which we have extracted two factors: factor G, reflecting the geological background enhanced by potash mining, and factor A, essentially controlled by urban and/or farming wastewater. Graphical representation of these two factors allows us to identify three extreme samples, corresponding to pristine waters, potash mining influence and urban sewage influence. To confirm this, we have available analyses of the diffuse and widespread point sources identified in the area: springs, potash mining lixiviates, sewage, and fertilisers. Each of these sources shows a clear link with one of the extreme samples, except fertilisers, due to the heterogeneity of their composition.
This approach is a useful tool to distinguish end-members and characterise them, an issue generally difficult to solve. It is worth noting that the end-member composition cannot be fully estimated but only characterised through log-ratio relationships among components. Moreover, the influence of each end-member in a given sample must be evaluated relative to the other samples. These limitations are intrinsic to the relative nature of compositional data.
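As a hedged illustration of the ilr log-contrast idea mentioned above (not the study's code), the sketch below computes an ilr "balance" between two groups of parts of a composition, which is the kind of equilibrium equation factors G and A represent; the column indices and the simulated data are placeholders.

# Minimal sketch: an isometric log-ratio balance between two disjoint groups
# of parts. `numer` and `denom` are hypothetical column indices.
import numpy as np

def ilr_balance(X, numer, denom):
    """Balance between two groups of parts of a composition X (rows = samples)."""
    r, s = len(numer), len(denom)
    coef = np.sqrt(r * s / (r + s))
    gm_num = np.exp(np.log(X[:, numer]).mean(axis=1))   # geometric mean of numerator parts
    gm_den = np.exp(np.log(X[:, denom]).mean(axis=1))   # geometric mean of denominator parts
    return coef * np.log(gm_num / gm_den)

# e.g. a "geological" balance contrasting parts {0, 1, 2} against parts {3, 4}
rng = np.random.default_rng(1)
X = rng.lognormal(size=(31, 14))          # 31 stations x 14 molarities, simulated
factor_G = ilr_balance(X, [0, 1, 2], [3, 4])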
Abstract:
There is hardly a case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these "zero data" represent a mathematical challenge for interpretation.
We need to start by recognizing that there are zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot co-exist with nepheline. Another common essential zero is a North azimuth; however, we can always change that zero for the value of 360°. These are known as "essential zeros", but what can we do with "rounded zeros", which result from values below the detection limit of the equipment?
Amalgamation, e.g. adding Na2O and K2O as total alkalis, is a solution, but sometimes we need to differentiate between a sodic and a potassic alteration. Pre-classification into groups requires a good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same will occur if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation).
The method that we propose takes into consideration the well-known relationships between some elements. For example, in copper porphyry deposits there is always a good direct correlation between the copper values and the molybdenum values, but while copper will always be above the detection limit, many of the molybdenum values will be "rounded zeros". So, we take the lower quartile of the real molybdenum values, establish a regression equation with copper, and then estimate the "rounded" zero values of molybdenum from their corresponding copper values.
The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable.
Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
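A minimal sketch of the proposed regression-based imputation, under stated assumptions: `cu` and `mo` are hypothetical assay arrays, zeros in `mo` mark below-detection-limit values, and the regression is fitted in log space (an assumption motivated by the lognormal behaviour mentioned above) on the lower quartile of the detected molybdenum values.

# Sketch only: estimate "rounded zeros" in Mo from the correlated Cu values.
import numpy as np

def impute_rounded_zeros(cu, mo):
    detected = mo > 0
    q1 = np.quantile(mo[detected], 0.25)
    low = detected & (mo <= q1)                       # lower quartile of real Mo values
    # least-squares fit log(mo) = a*log(cu) + b on that subset
    a, b = np.polyfit(np.log(cu[low]), np.log(mo[low]), 1)
    mo_filled = mo.astype(float).copy()
    mo_filled[~detected] = np.exp(a * np.log(cu[~detected]) + b)
    return mo_filled

# usage with simulated data
rng = np.random.default_rng(2)
cu = rng.lognormal(mean=0.0, sigma=0.5, size=200)
mo = 0.1 * cu * rng.lognormal(sigma=0.3, size=200)
mo[mo < 0.02] = 0.0                                   # simulate rounded zeros
mo_filled = impute_rounded_zeros(cu, mo)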
Abstract:
The word "minimal" or "mild" hearing loss seems to imply that their effects are mild or negligible. The literature supports that they can have a significant impact on educative end educational development of young children and contribute to problems in fields of social function, communication and educational achievement. Unilateral hearing loss in children has been considered for long to be of little consequence. In fact it causes problems in speech and language development, speech understanding, especially in noisy environments, and school results. Early diagnosis, follow-up during preschool and school ages are mandatory.
Abstract:
This paper proposes a heuristic for the scheduling of capacity requests and the periodic assignment of radio resources in geostationary (GEO) satellite networks with a star topology, using the Demand Assigned Multiple Access (DAMA) protocol at the link layer, and Multi-Frequency Time Division Multiple Access (MF-TDMA) and Adaptive Coding and Modulation (ACM) at the physical layer.
Abstract:
The neo-liberal capitalist ideology has come under heavy fire, with anecdotal evidence indicating a link between these values and unethical behavior. Academic institutions reflect social values and act as socializing agents for the young. Can this explain the high and increasing rates of cheating that currently prevail in education? Our first chapter examines the question of whether self-enhancement values of power and achievement, the individual-level equivalent of neo-liberal capitalist values, predict positive attitudes towards cheating. Furthermore, we explore the mediating role of motivational factors. Results of four studies reveal that self-enhancement value endorsement predicts the adoption of performance-approach goals, a relationship mediated by introjected regulation, namely the desire for social approval, and that self-enhancement value endorsement also predicts the condoning of cheating, a relationship mediated by performance-approach goal adoption. However, self-transcendence values prescribed by a normatively salient source have the potential to reduce the link between self-enhancement value endorsement and attitudes towards cheating. Normative assessment constitutes a key tool used by academic institutions to socialize young people to accept the competitive, meritocratic nature of a society driven by a neo-liberal capitalist ideology. As such, the manifest function of grades is to motivate students to work hard and to buy into the competitive ethos. Does normative assessment fulfill these functions? Our second chapter explores the reward-intrinsic motivation question in the context of grading, arguably a high-stakes reward. In two experiments, we assess the relative capacity of graded high performance, as compared to the task autonomy experienced in an ungraded task, to predict post-task intrinsic motivation. Results show that whilst graded task performance predicts post-task appreciation, it fails to predict ongoing motivation. However, perceived autonomy experienced in the non-graded condition predicts both post-task appreciation and ongoing motivation. Our third chapter asks whether normative assessment inspires the spirit of competition in students. Results of three experimental studies reveal that the expectation of a grade for a task, compared to no grade, induces greater adoption of performance-avoidance, but not performance-approach, goals. Experiment 3 provides an explanatory mechanism for this, showing that reduced autonomous motivation experienced in previous graded tasks mediates the relationship between grading and the adoption of performance-avoidance goals in a subsequent task. Taken together, these results provide evidence of the deleterious effects of self-enhancement values, and of the associated practice of normative assessment in school, on student motivation, goals and ethics. We conclude by using value and motivation theory to explore solutions to this problem.
Abstract:
Résumé: The origin of obesity, which has reached epidemic proportions, is complex. It is linked to lifestyle and to individuals' behaviour with respect to physical activity, an expression of individual choices and of the interaction with the environment. Measuring individuals' physical activity behaviour in relation to their environment, the distribution of physical activity types, and their duration, frequency, intensity and energy expenditure is of great importance. Today, there is a lack of methods allowing a precise and objective assessment of physical activity and of individuals' behaviour. In order to complement research relating physical activity to obesity and certain diseases, the first objective of this thesis work was to develop a model for the objective identification of physical activity types under real-life conditions and the estimation of energy expenditure, based on a combination of 2 accelerometers and 1 GPS device. The model takes into account that a given activity can be performed in different ways in real life. Daily activities could be classified into 8 categories, from sedentary to active, with 1-min resolution. Energy expenditure could be predicted accurately by the model. After validation of the model, individuals' physical activity behaviour was assessed in a second study. We hypothesized that, in an environment characterized by slopes, obese individuals are tempted to avoid steep slopes and to decrease walking speed during spontaneous physical activity, as well as during prescribed, structured exercise. We therefore characterized, by means of the developed model, the behaviour of obese individuals in a hilly urban environment. The way one tackles a hilly environment in everyday displacements should also be considered when prescribing extra walking in order to increase physical activity.
Summary: The origin of obesity, which has reached epidemic proportions, is complex and may be linked to different lifestyles and physical activity behaviours. Measurement of individuals' physical activity behaviour towards their environment, and of the distribution of physical activity in terms of type, volume, duration, frequency, intensity, and energy expenditure, is of great importance. Nowadays, there is a lack of methods for the accurate and objective assessment of physical activity and of individuals' physical activity behaviour. In order to complement the research relating physical activity to obesity and related diseases, the first aim of the thesis work was to develop a model for the objective identification of physical activity types and the estimation of energy expenditure in real-life conditions, based on a combination of 2 accelerometers and 1 GPS device. The model takes into account that a given activity can be achieved in many different ways in real-life conditions. Daily activities could be classified into 8 categories, from sedentary to active physical activity, with 1-min accuracy, and physical activity patterns determined. Energy expenditure could be predicted with an accuracy below 10%. Furthermore, individuals' physical activity behaviour is an expression of individual choices and of their interaction with the neighbourhood environment.
In a second study, we hypothesized that, in an environment characterized by inclines, obese individuals are tempted to avoid steep positive slopes and to decrease walking speed during spontaneous outdoor physical activity, as well as during prescribed, structured bouts of exercise. Finally, we characterized, by means of the developed model, the physical activity behaviour of obese individuals in a hilly urban environment. Quantifying how one tackles a hilly environment or avoids slopes in everyday displacements should also be considered when prescribing extra walking in free-living conditions in order to increase physical activity.
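The thesis model itself is not given in this abstract; purely as an illustration of the kind of pipeline it describes (per-minute sensor features mapped to one of 8 activity categories), here is a generic, hypothetical classification sketch. The feature names, the simulated data and the decision-tree classifier are assumptions, not the author's method.

# Illustrative sketch only: classify activity type per 1-min epoch from
# accelerometer + GPS features, then predict for new epochs.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical per-minute features: [trunk_accel_mean, thigh_accel_mean, gps_speed_m_s]
rng = np.random.default_rng(3)
X = rng.random((600, 3))                   # simulated feature matrix
y = rng.integers(0, 8, size=600)           # 8 activity categories, simulated labels

clf = DecisionTreeClassifier(max_depth=6).fit(X, y)
predicted_category = clf.predict(X[:5])    # one prediction per 1-min epoch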
Abstract:
Stone groundwood (SGW) is a fibrous material commonly prepared in a high-yield process and mainly used for papermaking applications. In this work, the use of SGW fibers is explored as a reinforcing element of polypropylene (PP) composites. Due to their chemical and surface features, the use of coupling agents is needed for good adhesion and stress transfer across the fiber-matrix interface. The intrinsic strength of the reinforcement is a key parameter to predict the mechanical properties of the composite and to perform an interface analysis. The main objective of the present work was the determination of the intrinsic tensile strength of stone groundwood fibers. Coupled and non-coupled PP composites from stone groundwood fibers were prepared. The influence of the surface morphology and of the quality of the interface on the final properties of the composite was analyzed and compared to that of fiberglass PP composites. The intrinsic tensile properties of stone groundwood fibers, as well as the fiber orientation factor and the interfacial shear strength of the current composites, were determined.
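Although the abstract does not spell out its equations, intrinsic fibre strength in short-fibre composites is often back-calculated from a modified rule of mixtures; the sketch below only illustrates that general relation, with placeholder numbers and a hypothetical coupling/orientation factor fc, not values or the exact model from this study.

# Back-calculate intrinsic fibre tensile strength from a modified rule of mixtures:
#   sigma_c = fc * sigma_f * V_f + sigma_m* * (1 - V_f), solved for sigma_f.
def intrinsic_fibre_strength(sigma_composite, sigma_matrix_at_break, V_f, fc):
    return (sigma_composite - sigma_matrix_at_break * (1.0 - V_f)) / (fc * V_f)

# placeholder values (MPa, volume fraction, dimensionless coupling factor)
print(intrinsic_fibre_strength(sigma_composite=45.0,
                               sigma_matrix_at_break=20.0,
                               V_f=0.25,
                               fc=0.2))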
Abstract:
An algorithm that optimizes and creates pairings for airline crews, subsequently implemented in Java.
Abstract:
The major objective of this problem identification document is the determination of the relative severity of traffic safety problems in each of the 99 counties. The National Highway Traffic Safety Administration and the Iowa Governor's Traffic Safety Bureau are committed to the reduction of death and injury on the nation's roads. As part of its duty in administering federal traffic safety funds in the State of Iowa, the Governor's Traffic Safety Bureau conducts a comprehensive Problem Identification update each year.
Abstract:
Escitalopram is a serotonin reuptake inhibitor prescribed for depression and anxiety. There is a paucity of information regarding its safety in pregnancy. The objective of this study was to determine whether escitalopram is associated with an increased risk for major malformations or other adverse outcomes following use in pregnancy. The authors analyzed pregnancy outcomes in women exposed to escitalopram (n = 212) versus other antidepressants (n = 212) versus nonteratogenic exposures (n = 212) and compared the outcomes. Among the escitalopram exposures there were 172 (81%) live births, 32 (15%) spontaneous abortions, 6 (2.8%) therapeutic abortions, 3 stillbirths (1.7%), and 3 major malformations (1.7%). The only significant differences among groups were the rates of low birth weight (<2500 g) and overall mean birth weight (P = .225). However, spontaneous abortion rates were higher in both antidepressant groups (15% and 16%) compared with controls (8.5%; P = .066). There were lower rates of live births (P = .006), lower overall birth weight (P < .001), and increased rates of low birth weight (<2500 g; P = .009) with escitalopram. Spontaneous abortion rates were nearly double in both antidepressant groups (15% and 16%) compared with controls (8.5%), but this difference was not significant (P = .066). Escitalopram does not appear to be associated with an increased risk for major malformations but appears to increase the risk for low birth weight, which was correlated with the increase in infants weighing <2500 g. In addition, the higher rates of spontaneous abortions in both antidepressant groups confirmed previous findings.
Abstract:
Habitat restoration measures may result in artificially high breeding density, for instance when nest-boxes saturate the environment, which can negatively impact species' demography. Potential risks include changes in mating and reproductive behaviour such as increased extra-pair paternity, conspecific brood parasitism, and polygyny. Under particular circumstances, these mechanisms may disrupt reproduction, with populations dragged into an extinction vortex. Using nuclear microsatellite markers, we investigated the occurrence of these potentially negative effects in a recovered population of a rare secondary cavity-nesting farmland bird of Central Europe, the hoopoe (Upupa epops). High-intensity farming in the study area has resulted in the total eradication of cavity trees, depriving hoopoes of breeding sites. An intensive nest-box campaign rectified this problem, resulting in a spectacular population recovery within only a few years. There was some concern, however, that the new, artificially induced high breeding density might alter hoopoe mating and reproductive behaviour. As the species underwent a serious demographic bottleneck in the 1970s-1990s, we also used the microsatellite markers to reconstitute the demo-genetic history of the population, looking in particular for signs of genetic erosion. We found i) a low occurrence of extra-pair paternity, polygyny and conspecific brood parasitism, ii) a high level of neutral genetic diversity (mean number of alleles and expected heterozygosity per locus: 13.8 and 83%, respectively), and iii) evidence for genetic connectivity through recent immigration of individuals from well differentiated populations. The recent increase in breeding density has thus far not induced any noticeable detrimental changes in mating and reproductive behaviour. The demographic bottleneck undergone by the population in the 1970s-1990s was furthermore not accompanied by any significant drop in neutral genetic diversity. Finally, the genetic data converged with a concomitant demographic study in showing that immigration strongly contributed to the local population recovery.
Abstract:
Purpose/Objective(s): To implement a carotid dose sparing protocol using helical Tomotherapy in T1N0 squamous cell laryngeal carcinoma.
Materials/Methods: Between July and August 2010, 7 men with stage T1N0 laryngeal carcinoma were included in this study. Ages ranged from 47 to 74 years. Staging included endoscopic examination, CT scan and MRI when indicated. The planned irradiation dose was 70 Gy in 35 fractions over 7 weeks. A simple treatment planning algorithm for carotid sparing was used: maximum point dose to the carotids 35 Gy, to the spinal cord 30 Gy, and 100% of the PTV volume to be covered with 95% of the prescribed dose. The carotid volume of interest extended to 1 cm above and below the PTV. Doses to the carotid arteries, to the critical organs, and to the planned target volume (PTV) with our standard laryngeal irradiation protocol were compared. Daily megavoltage scans were obtained before each fraction. When necessary, the Planned Adaptive software (TomoTherapy Inc., Madison, WI) was used to evaluate the need for re-planning, which was never indicated. Dose data were extracted using the VelocityAI software (Atlanta, GA), and data normalization and dose-volume histogram (DVH) interpolation were realized using the Igor Pro software (Portland, OR).
Results: A significant (p < 0.05) carotid dose sparing compared to our standard protocol was achieved, with an average maximum point dose of 38.3 Gy (standard deviation [SD] 4.05 Gy) and an average mean dose of 18.59 Gy (SD 0.83 Gy). In all patients, 95% of the carotid volume received less than 28.4 Gy (SD 0.98 Gy). The average maximum point dose to the spinal cord was 25.8 Gy (SD 3.24 Gy). The PTV was fully covered with more than 95% of the prescribed dose for all patients, with an average maximum point dose of 74.1 Gy and an absolute maximum dose in a single patient of 75.2 Gy. To date, the clinical outcomes have been excellent. Three patients (42%) developed stage 1 mucositis that was conservatively managed, and all the patients presented mild to moderate dysphonia. All adverse effects resolved spontaneously in the month following the end of treatment. The early local control rate is 100% at a 4 - 5 month post-treatment follow-up.
Conclusions: Helical Tomotherapy allows a clinically significant decrease of the carotid irradiation dose compared to standard irradiation protocols, with an acceptable spinal cord dose tradeoff. Moreover, this technique allows the PTV to be homogeneously covered with a curative irradiation dose. Daily control imaging brings added security margins, especially when working with high dose gradients. Further investigations and follow-up are underway to better evaluate the late clinical outcomes, especially the local control rate, late laryngeal and vascular toxicity, and the expected potential impact on cerebrovascular events.