977 results for statistical techniques
Abstract:
In the context of climate change over South America (SA), it has been observed that combinations of high temperatures with either more or less rainfall produce different impacts, such as extreme precipitation events, conditions favorable to fires, and droughts. As a result, these regions face a growing threat of local or widespread water shortage. Water availability in Brazil thus depends largely on the weather and its variations on different time scales. The main objective of this research is therefore to study the moisture budget using regional climate models (RCMs) from the project Regional Climate Change Assessments for La Plata Basin (CLARIS-LPB), and to combine these RCMs through two statistical techniques in an attempt to improve prediction over three areas of SA: the Amazon (AMZ), Northeast Brazil (NEB) and the La Plata Basin (LPB), in past (1961-1990) and future (2071-2100) climates. Moisture transport over SA was investigated through vertically integrated moisture fluxes. The main results showed that the mean water vapor fluxes in the tropics (AMZ and NEB) are larger across the eastern and northern edges, indicating that the contributions of the North and South Atlantic trade winds are equally important for moisture input during JJA and DJF. This configuration was observed in all models and climates. Comparing climates, the convergence of the moisture flux in the past climate was smaller than in the future in various regions and seasons. Likewise, for the future climate the majority of the RCMs simulate reduced precipitation in the tropical regions (AMZ and NEB) and an increase in the LPB region. The second phase of this research was to combine the RCMs to predict precipitation more accurately, through multiple regression on principal components (C.RPC) and a convex combination (C.EQM), and then to analyze and compare the resulting RCM combinations (ensembles).
The results indicated that the C.RPC combination better represented observed precipitation in both climates: in addition to producing values close to those observed, this technique obtained correlation coefficients of moderate to strong magnitude in almost every month, across climates and regions, as well as lower data dispersion (RMSE). A notable advantage of both combination methods was their ability to capture extreme events (outliers) in the study regions. In general, C.EQM captured more wet extremes, while C.RPC captured more dry extremes, in both climates and in the three regions studied.
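The convex-combination idea can be illustrated with a small sketch, assuming nothing about the CLARIS-LPB models themselves: three synthetic "model" series are blended with non-negative weights that sum to one, chosen by a simple grid search to minimize RMSE against observations. The data are simulated, and the grid search is only a stand-in for whatever optimizer C.EQM actually used.

```python
import numpy as np

# Hypothetical monthly precipitation (mm/day) from three RCMs and observations;
# all values here are simulated placeholders, not CLARIS-LPB output.
rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 2.0, size=120)              # 10 years of monthly values
models = np.stack([obs + rng.normal(0.0, 1.0, 120),
                   obs + rng.normal(0.5, 1.5, 120),
                   obs + rng.normal(-0.3, 2.0, 120)])

# Convex combination: non-negative weights summing to 1, chosen by
# least squares over a simplex grid.
best_w, best_rmse = None, np.inf
for w1 in np.linspace(0, 1, 51):
    for w2 in np.linspace(0, 1 - w1, 51):
        w = np.array([w1, w2, 1 - w1 - w2])
        rmse = np.sqrt(np.mean((w @ models - obs) ** 2))
        if rmse < best_rmse:
            best_w, best_rmse = w, rmse

single_rmse = np.sqrt(np.mean((models - obs) ** 2, axis=1)).min()
print(best_w, best_rmse, single_rmse)
assert best_rmse <= single_rmse  # the ensemble is at least as good as its best member
```

Because the grid includes the corners of the simplex, the combination can never do worse (in training RMSE) than the best single model, which is one motivation for ensemble combination in the first place.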
Abstract:
Private Higher Education Institutions (HEIs) are embedded in a market where competitiveness is a key factor. To remain competitive, HEIs need proactive and innovative strategies, especially for understanding their main customers, the students, with regard to their expectations about HEI quality. This study evaluates private higher education institutions in the city of Natal/RN with respect to the strategies adopted to remain in the market, based on the quality perceived by students. The research was conducted at two private institutions in Natal, using an exploratory survey with a questionnaire applied to senior students in Business, Accounting and Law. The instrument maps five dimensions: (1) teaching, covering methods and teaching tools; (2) teachers, specifying quality attributes related to faculty; (3) infrastructure, describing the HEI environment; (4) services, evaluating the quality of the HEI's processes; and (5) intangibles, relating to student satisfaction. The results were analyzed with descriptive statistical techniques using the Statistical Package for the Social Sciences (SPSS). The first stage of the results characterizes the descriptive analysis of the overall sample and by HEI and course, followed by a univariate analysis of each HEI and a bivariate analysis of the correlation among factors using Spearman's correlation coefficient. The results were then used to compose an importance-versus-performance matrix, compared against guidelines from the Ministry of Education and Culture (MEC). Finally, these comparisons allowed identification of the factors most important to HEI quality and the institutions' level of performance on each attribute of the quality dimensions.
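As an illustration of the bivariate step, the sketch below computes Spearman correlations and a crude importance-versus-performance table on simulated Likert responses. The dimension names follow the abstract; the data and the use of an overall-satisfaction proxy for importance are assumptions, not the study's actual procedure.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical 1-6 Likert responses for the five quality dimensions;
# the dimension names follow the abstract, the numbers are simulated.
rng = np.random.default_rng(1)
dims = ["teaching", "teachers", "infrastructure", "services", "intangibles"]
performance = rng.integers(1, 7, size=(100, 5)).astype(float)  # 100 students

rho, p = spearmanr(performance[:, 0], performance[:, 1])
print(f"Spearman rho(teaching, teachers) = {rho:.2f} (p = {p:.3f})")

# Importance-versus-performance table: mean performance per dimension vs.
# an importance proxy (correlation with an overall-satisfaction item).
overall = performance.mean(axis=1) + rng.normal(0, 0.3, 100)
for name, col in zip(dims, performance.T):
    importance, _ = spearmanr(col, overall)
    print(f"{name:15s} performance={col.mean():.2f} importance={importance:.2f}")
```

Plotting these pairs on two axes yields the importance-versus-performance matrix the abstract describes, with each quadrant suggesting a different strategic priority.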
Abstract:
Finite-Differences Time-Domain (FDTD) algorithms are well-established tools of computational electromagnetics. Because of their practical implementation as computer codes, they are affected by many numerical artefacts and noise. To obtain better results, we propose using Principal Component Analysis (PCA), based on multivariate statistical techniques. PCA has been successfully used for the analysis of noise and spatio-temporal structure in sequences of images. It allows a straightforward discrimination between numerical noise and the actual electromagnetic variables, and a quantitative estimation of their respective contributions. In addition, the FDTD results can be filtered to remove the effect of the noise. In this contribution we show how the method can be applied to several FDTD simulations: the propagation of a pulse in vacuum, and the analysis of two-dimensional photonic crystals. In the latter case, PCA has revealed hidden electromagnetic structures related to actual modes of the photonic crystal.
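A minimal sketch of the idea, with a synthetic travelling wave standing in for an FDTD field sequence: each time snapshot is one row of a data matrix, PCA is computed via the SVD, and the field is reconstructed from the leading components only. The rank-2 signal and the noise level are invented for illustration.

```python
import numpy as np

# Toy stand-in for an FDTD run: a travelling sine wave plus "numerical noise".
rng = np.random.default_rng(2)
nt, nx = 200, 64
x = np.linspace(0, 2 * np.pi, nx)
t = np.arange(nt)
signal = np.sin(x[None, :] - 0.1 * t[:, None])     # propagating wave, rank 2
noisy = signal + 0.05 * rng.normal(size=(nt, nx))

# Each row is one time snapshot; PCA = SVD of the mean-centred data matrix.
mean = noisy.mean(axis=0)
U, s, Vt = np.linalg.svd(noisy - mean, full_matrices=False)
var = s**2 / np.sum(s**2)                          # variance per component
k = 2                                              # leading components hold the physics
filtered = mean + (U[:, :k] * s[:k]) @ Vt[:k]      # noise discarded

err_noisy = np.sqrt(np.mean((noisy - signal) ** 2))
err_filt = np.sqrt(np.mean((filtered - signal) ** 2))
print(var[:4], err_noisy, err_filt)
assert err_filt < err_noisy   # filtering removes most of the noise
```

The variance spectrum `var` is what lets one quantify the respective contributions of signal and noise: the physical modes dominate the first few components, the noise spreads thinly over the rest.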
Abstract:
Consumers have relationships with other people, and they have relationships with brands similar to the ones they have with other people. Yet very little is known about how brand and interpersonal relationships relate to one another, and even less about how they jointly affect consumer well-being. The goal of this research, therefore, is to examine how brand and interpersonal relationships influence and are influenced by consumer well-being. Essay 1 uses empirical methods and surveys of individuals and couples to investigate how consumer preferences in romantic couples, namely brand compatibility, influence life satisfaction. Using traditional statistical techniques and multilevel modeling, I find that the effect of brand compatibility, or the extent to which partners have similar brand preferences, on life satisfaction depends upon power in the relationship. For high-power partners, brand compatibility has no effect on life satisfaction; for low-power partners, low brand compatibility is associated with decreased life satisfaction. I find that conflict mediates the link between brand compatibility and power on life satisfaction. In Essay 2, I again use empirical methods and surveys to investigate how resources, which can be considered a form of consumer well-being, influence brand and interpersonal relationships. Although social connection has long been considered a fundamental human motivation and deemed necessary for well-being (Baumeister and Leary 1995), recent research has demonstrated that having greater resources is associated with weaker social connections. In the current research I posit that individuals with greater resources still have a need to connect and are using other sources for connection, namely brands. Across several studies I test and find support for my theory that resource level shifts the preference for social connection from people to brands.
Specifically, I find that individuals with greater resources have stronger brand relationships, as measured by self-brand connection, brand satisfaction, purchase intentions and willingness to pay, both within existing brand relationships and with new brands. This suggests that individuals with greater resources place more emphasis on these relationships. Furthermore, I find that resource level influences the stated importance of brand and interpersonal relationships, and that having or perceiving greater resources is associated with an increased preference to engage with brands over people. This research demonstrates that there are times when people prefer and seek out connections with brands over other people, and highlights the ways in which our brand and interpersonal relationships influence one another.
Abstract:
An abstract of a thesis devoted to using helix-coil models to study unfolded states.
Research on polypeptide unfolded states has received much more attention in the last decade or so than it has in the past. Unfolded states are thought to be implicated in various
misfolding diseases and likely play crucial roles in protein folding equilibria and folding rates. Structural characterization of unfolded states has proven to be
much more difficult than the now well established practice of determining the structures of folded proteins. This is largely because many core assumptions underlying
folded structure determination methods are invalid for unfolded states. This has led to a dearth of knowledge concerning the nature of unfolded state conformational
distributions. While many aspects of unfolded state structure are not well known, there does exist a significant body of work stretching back half a century that
has been focused on structural characterization of marginally stable polypeptide systems. This body of work represents an extensive collection of experimental
data and biophysical models associated with describing helix-coil equilibria in polypeptide systems. Much of the work on unfolded states in the last decade has not been devoted
specifically to the improvement of our understanding of helix-coil equilibria, which arguably is the most well characterized of the various conformational equilibria
that likely contribute to unfolded state conformational distributions. This thesis seeks to provide a deeper investigation of helix-coil equilibria using modern
statistical data analysis and biophysical modeling techniques. The studies contained within seek to provide deeper insights and new perspectives on what we presumably
know very well about protein unfolded states.
Chapter 1 gives an overview of recent and historical work on studying protein unfolded states. The study of helix-coil equilibria is placed in the context
of the general field of unfolded state research and the basics of helix-coil models are introduced.
Chapter 2 introduces the newest incarnation of a sophisticated helix-coil model. State of the art modern statistical techniques are employed to estimate the energies
of various physical interactions that serve to influence helix-coil equilibria. A new Bayesian model selection approach is utilized to test many long-standing
hypotheses concerning the physical nature of the helix-coil transition. Some assumptions made in previous models are shown to be invalid and the new model
exhibits greatly improved predictive performance relative to its predecessor.
Chapter 3 introduces a new statistical model that can be used to interpret amide exchange measurements. As amide exchange can serve as a probe for residue-specific
properties of helix-coil ensembles, the new model provides a novel and robust method to use these types of measurements to characterize helix-coil ensembles experimentally
and test the position-specific predictions of helix-coil models. The statistical model is shown to perform substantially better than the most commonly used
method for interpreting amide exchange data. The estimates of the model obtained from amide exchange measurements on an example helical peptide
also show a remarkable consistency with the predictions of the helix-coil model.
Chapter 4 involves a study of helix-coil ensembles through the enumeration of helix-coil configurations. Aside from providing new insights into helix-coil ensembles,
this chapter also introduces a new method by which helix-coil models can be extended to calculate new types of observables. Future work on this approach could potentially
allow helix-coil models to move into domains of application that were previously inaccessible and reserved for the other types of unfolded state models introduced in chapter 1.
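The thesis's helix-coil model is far more sophisticated, but the core partition-function machinery can be sketched with the classic Zimm-Bragg transfer-matrix calculation. The nucleation and propagation parameters below are arbitrary illustrative values, not estimates from the thesis.

```python
import numpy as np

def zimm_bragg(n, s, sigma):
    """Partition function and mean fractional helicity of an n-residue chain
    under the Zimm-Bragg model (propagation s, nucleation penalty sigma)."""
    # Transfer matrix over states (coil, helix); sigma penalises starting a helix.
    M = np.array([[1.0, sigma * s],
                  [1.0, s]])
    v0 = np.array([1.0, 0.0])                  # chain starts in coil
    Z = v0 @ np.linalg.matrix_power(M, n) @ np.ones(2)
    # Mean helicity theta = (1/n) * dlnZ/dln(s), via a numerical derivative.
    ds = 1e-6
    M2 = np.array([[1.0, sigma * (s + ds)],
                   [1.0, s + ds]])
    Z2 = v0 @ np.linalg.matrix_power(M2, n) @ np.ones(2)
    theta = s * (Z2 - Z) / ds / Z / n
    return Z, theta

Z, theta = zimm_bragg(n=30, s=1.2, sigma=1e-3)
print(Z, theta)
```

With a small nucleation parameter the model reproduces the cooperative character of the transition: helicity changes sharply as `s` crosses 1, which is the behaviour more elaborate Lifson-Roig-type models refine with residue-specific energies.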
Abstract:
Highlights of Data Expedition:
• Students explored daily observations of local climate data spanning the past 35 years.
• Topological Data Analysis (TDA) provides cutting-edge tools for studying the geometry of data in arbitrarily high dimensions.
• Using TDA tools, students discovered intrinsic dynamical features of the data and learned how to quantify periodic phenomena in a time-series.
• Since nature invariably produces noisy data that rarely has exact periodicity, students also considered the theoretical basis of almost-periodicity and even invented and tested new mathematical definitions of almost-periodic functions.

Summary
The dataset we used for this data expedition comes from the Global Historical Climatology Network. “GHCN (Global Historical Climatology Network)-Daily is an integrated database of daily climate summaries from land surface stations across the globe.” Source: https://www.ncdc.noaa.gov/oa/climate/ghcn-daily/ We focused on the daily maximum and minimum temperatures from January 1, 1980 to April 1, 2015 collected at RDU International Airport. Through a guided series of exercises designed to be performed in Matlab, students explore these time-series, initially by direct visualization and basic statistical techniques. Then students are guided through a special sliding-window construction which transforms a time-series into a high-dimensional geometric curve. These high-dimensional curves can be visualized by projecting down to lower dimensions, as in Figure 1; however, our focus here was to use persistent homology to directly study the high-dimensional embedding. The shape of these curves carries meaningful information, but how one describes the “shape” of data depends on the scale at which the data is considered, and choosing the appropriate scale is rarely obvious. Persistent homology overcomes this obstacle by allowing us to quantitatively study geometric features of the data across multiple scales.
Through this data expedition, students are introduced to numerically computing persistent homology using the Rips collapse algorithm and to interpreting the results. In the specific context of sliding-window constructions, 1-dimensional persistent homology can reveal the nature of periodic structure in the original data. I created a special technique to study how these high-dimensional sliding-window curves form loops in order to quantify the periodicity. Students are guided through this construction and learn how to visualize and interpret this information. Climate data is extremely complex (as anyone who has suffered from a bad weather prediction can attest), and numerous variables play a role in determining our daily weather and temperatures. This complexity, coupled with imperfections of measuring devices, results in very noisy data, which causes the annual seasonal periodicity to be far from exact. To this end, I have students explore existing theoretical notions of almost-periodicity and test them on the data. They find that some existing definitions are inadequate in this context, so I challenged them to invent new mathematics by proposing and testing their own definitions. The students rose to the challenge and suggested a number of creative definitions. While autocorrelation and spectral methods based on Fourier analysis are often used to explore periodicity, the construction here provides an alternative paradigm for quantifying periodic structure in almost-periodic signals using tools from topological data analysis.
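The sliding-window construction itself is compact. Below is a sketch under assumed parameters: the window dimension, delay and synthetic "temperature" series are all invented, and the persistent-homology step (e.g. via a Rips complex, using a TDA library such as ripser) would then be applied to the resulting point cloud.

```python
import numpy as np

# Synthetic stand-in for a daily temperature record: a yearly cycle plus noise.
rng = np.random.default_rng(3)
t = np.arange(3650)                            # ten "years" of daily samples
temps = 15 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 2, t.size)

def sliding_window(x, d, tau):
    """Embed x into R^d: each point is d samples of x spaced tau apart."""
    n = x.size - (d - 1) * tau
    return np.stack([x[i : i + (d - 1) * tau + 1 : tau] for i in range(n)])

cloud = sliding_window(temps, d=20, tau=15)
print(cloud.shape)

# For a periodic signal the embedded curve closes into a loop: points one
# period apart land near each other, points half a period apart do not.
same_phase = np.linalg.norm(cloud[0] - cloud[365])
off_phase = np.linalg.norm(cloud[0] - cloud[180])
print(same_phase, off_phase)
```

It is exactly this loop structure that 1-dimensional persistent homology detects, with the persistence of the loop serving as a noise-robust score of how periodic the signal is.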
Abstract:
Energy efficiency and user comfort have recently become priorities in the Facility Management (FM) sector. This has resulted in the use of innovative building components, such as thermal solar panels and heat pumps, as they have the potential to provide better performance, energy savings and increased user comfort. However, as the complexity of components increases, so does the requirement for maintenance management. The standard routine for building maintenance is inspection, followed by repairs or replacement when a fault is found. This routine leads to unnecessary inspections, which carry a cost in component downtime and work hours. This research proposes an alternative routine: performing building maintenance at the point in time when the component is degrading and requires maintenance, thus reducing the frequency of unnecessary inspections. This thesis demonstrates that statistical techniques can be used as part of a maintenance management methodology to invoke maintenance before failure occurs. The proposed FM process is presented through a scenario utilising current Building Information Modelling (BIM) technology and innovative contractual and organisational models. This FM scenario supports a Degradation-based Maintenance (DbM) scheduling methodology, implemented using two statistical techniques: Particle Filters (PFs) and Gaussian Processes (GPs). DbM consists of extracting and tracking a degradation metric for a component. Limits for the degradation metric are identified by one of a number of proposed processes, which determine the limits based on the maturity of the historical information available. DbM is implemented for three case study components: a heat exchanger, a heat pump and a set of bearings. The degradation points identified for each case study by a PF, a GP and a hybrid (PF and GP combined) DbM implementation are assessed against known degradation points.
The GP implementations are successful for all components. For the PF implementations, the results presented in this thesis find that the extracted metrics and limits identify degradation occurrences accurately for components in continuous operation; for components with seasonal operational periods, the PF may wrongly identify degradation. The GP performs more robustly than the PF, but the PF, on average, produces fewer false positives. The hybrid implementations, which combine GP and PF results, are successful for two of the three case studies and are not affected by seasonal data. Overall, DbM is effectively applied to the three case study components. The accuracy of the implementations is dependent on the relationships modelled by the PF and GP, and on the type and quantity of data available. This novel maintenance process can improve equipment performance and reduce energy wastage from BSC operation.
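A minimal sketch of the GP half of a DbM-style implementation, using scikit-learn on an invented degradation metric and an invented fixed limit; the thesis's actual metrics, components and limit-setting processes are not reproduced here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical degradation metric for a component (say, heat-exchanger
# effectiveness) drifting downward over 100 weeks; all values are invented.
rng = np.random.default_rng(4)
t = np.arange(100, dtype=float)[:, None]               # weeks in service
metric = 1.0 - 0.004 * t.ravel() + rng.normal(0, 0.01, 100)

# Smooth the noisy metric with a GP; the kernel values are only starting
# guesses, refined by the built-in hyperparameter optimiser during fit.
gp = GaussianProcessRegressor(kernel=RBF(20.0) + WhiteKernel(1e-4),
                              normalize_y=True).fit(t, metric)
mean, std = gp.predict(t, return_std=True)

limit = 0.8                        # an invented degradation limit
below = t[mean < limit]
first_cross = below[0, 0]
print(f"maintenance flagged at week {first_cross:.0f}")
```

The GP's posterior standard deviation (`std`) is what makes the approach attractive for scheduling: maintenance can be invoked when the metric crosses the limit with a chosen level of confidence rather than at the noisy raw crossing.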
Abstract:
Forests change with changes in their environment based on the physiological responses of individual trees. These short-term reactions have cumulative impacts on long-term demographic performance. For a tree in a forest community, success depends on biomass growth to capture above- and belowground resources and reproductive output to establish future generations. Here we examine aspects of how forests respond to changes in moisture and light availability and how these responses are related to tree demography and physiology.
First we address the long-term pattern of tree decline before death and its connection with drought. Increasing drought stress and chronic morbidity could have pervasive impacts on forest composition in many regions. We use long-term, whole-stand inventory data from southeastern U.S. forests to show that trees exposed to drought experience multiyear declines in growth prior to mortality. Following a severe, multiyear drought, 72% of trees that did not recover their pre-drought growth rates died within 10 years. This pattern was mediated by local moisture availability. As an index of morbidity prior to death, we calculated the difference in cumulative growth after drought relative to surviving conspecifics. The strength of drought-induced morbidity varied among species and was correlated with species drought tolerance.
Next, we investigate differences among tree species in reproductive output relative to biomass growth with changes in light availability. Previous studies reach conflicting conclusions about the constraints on reproductive allocation relative to growth and how they vary through time, across species, and between environments. We test the hypothesis that canopy exposure to light, a critical resource, limits reproductive allocation by comparing long-term relationships between reproduction and growth for trees from 21 species in forests throughout the southeastern U.S. We found that species had divergent responses to light availability, with shade-intolerant species experiencing an alleviation of trade-offs between growth and reproduction at high light. Shade-tolerant species showed no changes in reproductive output across light environments.
Given that the above patterns depend on the maintenance of transpiration, we next developed an approach for predicting whole-tree water use from sap flux observations. Accurately scaling these observations to tree- or stand-levels requires accounting for variation in sap flux between wood types and with depth into the tree. We compared different models with sap flux data to test the hypotheses that radial sap flux profiles differ by wood type and tree size. We show that radial variation in sap flux is dependent on wood type but independent of tree size for a range of temperate trees. The best-fitting model predicted out-of-sample sap flux observations and independent estimates of sapwood area with small errors, suggesting robustness in new settings. We outline a method for predicting whole-tree water use with this model and include computer code for simple implementation in other studies.
Finally, we estimated tree water balances during drought with a statistical time-series analysis. Moisture limitation in forest stands comes predominantly from water use by the trees themselves, a drought-stand feedback. We show that drought impacts on tree fitness and forest composition can be predicted by tracking the moisture reservoir available to each tree in a mass balance. We apply this model to multiple seasonal droughts in a temperate forest with measurements of tree water use to demonstrate how species and size differences modulate moisture availability across landscapes. As trees deplete their soil moisture reservoir during droughts, a transpiration deficit develops, leading to reduced biomass growth and reproductive output.
This dissertation draws connections between the physiological condition of individual trees and their behavior in crowded, diverse, and continually-changing forest stands. The analyses take advantage of growing data sets on both the physiology and demography of trees as well as novel statistical techniques that allow us to link these observations to realistic quantitative models. The results can be used to scale up tree measurements to entire stands and address questions about the future composition of forests and the land’s balance of water and carbon.
Abstract:
The principal purpose of this research was to investigate the discriminant factors of survival and failure of micro and small businesses, and the impact of these factors on public entrepreneurship policies in the State of Rio Grande do Norte. The data were provided by SEBRAE/RN and the Commercial Board of the State of Rio Grande do Norte and covered businesses registered in 2000, 2001 and 2002. Following the theoretical framework, three groups of factors were defined (Business Financial Structure, Entrepreneurial Preparation and Entrepreneurial Behavior), and these factors were studied to determine whether or not they discriminate between business survival and failure. A quantitative approach was applied using advanced multivariate statistical techniques, beginning with factor analysis and proceeding to discriminant analysis. As a result, canonical discriminant functions were found that partially explained survival and failure in terms of the factors and groups of factors. The analysis also permitted an evaluation of the public entrepreneurship policies, and it was verified that, in the entrepreneurs' view, these policies were only weakly effective in preventing business failure. Changes to these policies were suggested based on the most significant factors found.
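The discriminant-analysis step can be sketched as follows, with simulated scores on the three factor groups standing in for the SEBRAE/RN data; the group names follow the abstract, but the data and effect size are invented.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical factor scores (financial structure, entrepreneurial
# preparation, entrepreneurial behaviour) for failed (0) and surviving (1)
# firms; survivors are simulated to score somewhat higher on all three.
rng = np.random.default_rng(5)
n = 200
survived = rng.integers(0, 2, n)
X = rng.normal(0, 1, (n, 3)) + 0.8 * survived[:, None]

lda = LinearDiscriminantAnalysis().fit(X, survived)
accuracy = lda.score(X, survived)
print(lda.coef_, accuracy)
```

The fitted coefficients play the role of the canonical discriminant function: their relative magnitudes indicate which factor groups discriminate most strongly between survival and failure, which is the kind of evidence the policy recommendations were based on.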
Abstract:
Self-efficacy, a construct developed by Albert Bandura in 1977 and widely studied around the world, refers to an individual's belief in his or her own capacity to successfully perform a certain activity. This study aims to determine the degree of association between sociodemographic characteristics and professional training and the levels of Self-Efficacy at Work (SEW) of Administrative Assistants at a federal university. It is a descriptive study, submitted to and approved by the Ethics Committee of UFRN. Quantitative data analysis was carried out with the aid of the statistical programs R and Minitab. The research instrument comprised a sociodemographic questionnaire, professional-training variables and the General Perception of Self-Efficacy Scale (GPSES), applied to a sample of 289 Administrative Assistants. The statistical techniques used were descriptive statistics, cluster analysis, a reliability test (Cronbach's alpha) and a significance test (Pearson). The results show a sociodemographic profile of the UFRN Administrative Assistants with well-distributed characteristics: 48.4% men and 51.6% women; 59.9% aged over 40; married (49.3%); white in color or race (58%); Catholic (67.8%); families of up to four people (75.8%), with children (59.4%) of all age groups; the mothers of these professionals are mostly housewives (51.6%), and schooling up to high school predominates among both fathers (72%) and mothers (75.8%). The Administrative Assistants have high levels of professional training and fall mostly into two groups: recently hired civil servants (30.7%) and those with long service (59%); the majority enter the career young and stay until retirement, and 72.4% have training above the minimum required for the job.
The analysis of SEW levels shows medium to high levels for 72% of the Administrative Assistants; even those classified as low-SEW showed a fairly high mean of 2.7, close to the overall mean of 2.9 reported in other studies. The cluster analysis showed that the characteristics of the three groups (Low, Medium and High SEW) are similar: representatives with all the characteristics investigated can be found at all three SEW levels. The results indicate no association between the sociodemographic and professional-training variables and the levels of self-efficacy at work of the UFRN Administrative Assistants, except for the variable color or race. However, given the small number of people who declared themselves black (4% of the sample), this result may be mere coincidence, or the black respondents in this study may have a higher sense of efficacy than the white and brown ones. The study corroborates other research and highlights the subjectivity of the self-efficacy construct. Further research, especially with public servants, is needed to continue and expand studies on the subject, making it possible to compare and confirm the results.
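The reliability statistic mentioned above, Cronbach's alpha, is simple to compute. Here is a sketch on simulated scale data: the 10-item scale, the latent-trait model and the noise level are assumptions for illustration, not the GPSES itself.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()    # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulated 10-item scale answered by 289 respondents: a shared latent
# trait plus item noise, so the items should be internally consistent.
rng = np.random.default_rng(6)
trait = rng.normal(0, 1, (289, 1))
scores = trait + rng.normal(0, 0.8, (289, 10))
alpha = cronbach_alpha(scores)
print(round(alpha, 2))
```

Values above roughly 0.7 are conventionally taken to indicate acceptable internal consistency for a scale of this kind.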
Abstract:
This study presents the results of exploratory-descriptive research that aimed to identify the latent dimensions of communication and to find relations between those dimensions and organizational image. The sample totaled 267 respondents: 89 managers or owners and 178 salespeople from clothing and footwear stores located in the five main shopping centers of Natal, capital of Rio Grande do Norte. Data were collected using two structured, validated instruments, with answers measured on a 6-point Likert scale. Communication was measured with the instrument developed by Downs and Hazen (2002), comprising 8 latent dimensions and 32 indicators; image was measured with the model of Mael and Ashforth (1992), which contains 5 indicators. The data were analyzed using the statistical techniques of factor analysis and structural equation modeling. The factor analysis showed communication to be formed by five latent dimensions. The structural model, in turn, showed positive relations between communication and organizational image: image is influenced by communication with the supervisor and by organizational integration, and is most strongly explained by vertical communication.
Abstract:
In the last thirty years, the emergence and progression of biologging technology has led to great advances in marine predator ecology. Large databases of location and dive observations from biologging devices have been compiled for an increasing number of diving predator species (such as pinnipeds, sea turtles, seabirds and cetaceans), enabling complex questions about animal activity budgets and habitat use to be addressed. Central to answering these questions is our ability to correctly identify and quantify the frequency of essential behaviours, such as foraging. Despite technological advances that have increased the quality and resolution of location and dive data, accurately interpreting behaviour from such data remains a challenge, and analytical methods are only beginning to unlock the full potential of existing datasets. This review evaluates both traditional and emerging methods and presents a starting platform of options for future studies of marine predator foraging ecology, particularly from location and two-dimensional (time-depth) dive data. We outline the different devices and data types available, discuss the limitations and advantages of commonly-used analytical techniques, and highlight key areas for future research. We focus our review on pinnipeds - one of the most studied taxa of marine predators - but offer insights that will be applicable to other air-breathing marine predator tracking studies. We highlight that traditionally-used methods for inferring foraging from location and dive data, such as first-passage time and dive shape analysis, have important caveats and limitations depending on the nature of the data and the research question. We suggest that more holistic statistical techniques, such as state-space models, which can synthesise multiple track, dive and environmental metrics whilst simultaneously accounting for measurement error, offer more robust alternatives. 
Finally, we identify a need for more research to elucidate the role of physical oceanography, device effects, study animal selection, and developmental stages in predator behaviour and data interpretation.
Abstract:
This paper presents the development and evaluation of PICTOAPRENDE, interactive software designed to improve oral communication and to contribute to the development of children and young people diagnosed with autism spectrum disorder (ASD) in Ecuador. To fulfill this purpose, the intervention area is first analyzed, describing the general characteristics of people with ASD and their status in Ecuador. The statistical techniques used for the evaluation form the basis of this study. A section presenting the development of research-based cognitive and social parameters of the intervention area is also included. Finally, the algorithms used to obtain the measurements and the experimental results, together with their analysis, are presented.
Abstract:
The Gorleben salt dome is currently being investigated for its suitability as a repository for radioactive waste. It is crossed by a subglacial drainage channel formed during the Elsterian glaciation (the Gorleben channel). Some units of its filling vary strongly in level and thickness: the lowest positions and/or greatest thicknesses are found above the salt dome. This is interpreted as a result of subrosion during the Saalean glaciation, whose rate can be calculated from level differences of sediments deposited during the Holsteinian interglacial. However, their position might also have been influenced by other factors (the relief of the channel bottom, glacial tectonics, settlement of underlying clay-rich sediments). The relevance of these factors was estimated by applying statistical techniques to level and thickness data from 79 drillings in the Gorleben channel. Two classes of drillings, with features caused either by Saalean subrosion or by sedimentary processes during the filling of the Gorleben channel, can be distinguished by means of factor and discriminant analysis. This interpretation is supported by the results of classwise correlation and regression analysis. Effects of glacial tectonics on the position of Holsteinian sediments cannot be mistaken for subrosion, and the influence of the settlement of underlying clay sediments can be estimated quantitatively. Saalean subrosion rates calculated from level differences of Holsteinian sediments between the two classes differ with the method applied: maximum values are 0.83 or 0.96 mm/a, average values 0.31 or 0.41 mm/a.
Abstract:
The aim of this research is twofold: first, to model and solve a complex nurse scheduling problem with an integer programming formulation and evolutionary algorithms; second, to detail a novel statistical method for comparing, and hence building better, scheduling algorithms by identifying successful algorithm modifications. The comparison method captures the results of an algorithm in a single figure that can then be compared using traditional statistical techniques. The proposed method of comparing algorithms is thus an objective procedure designed to assist in the process of improving an algorithm, and it works even when some results are non-numeric or missing due to infeasibility. The final algorithm outperforms all previous evolutionary algorithms, which relied on human expertise for modification.
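The comparison idea, reducing each run to a single figure and applying a traditional paired test while still accommodating infeasible runs, might be sketched like this. The costs, the penalty scheme for infeasibility and the choice of the Wilcoxon signed-rank test are illustrative assumptions, not the paper's exact method.

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical best costs found by two algorithm variants on eight
# scheduling instances; np.nan marks an infeasible run. All values invented.
costs_a = np.array([310.0, 295.0, np.nan, 305.0, 320.0, 290.0, 300.0, 315.0])
costs_b = np.array([300.0, 290.0, 330.0, 298.0, np.nan, 285.0, 296.0, 310.0])

# Score infeasible runs worse than any feasible cost so they rank last;
# this lets a rank-based test handle missing/non-numeric outcomes.
penalty = np.nanmax([costs_a, costs_b]) + 100
a = np.where(np.isnan(costs_a), penalty, costs_a)
b = np.where(np.isnan(costs_b), penalty, costs_b)

stat, p = wilcoxon(a, b)
print(f"Wilcoxon statistic={stat}, p={p:.3f}")
```

Because the test uses only the ranks of the paired differences, the exact penalty value does not matter as long as it exceeds every feasible cost, which is what makes this style of comparison robust to infeasibility.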