956 results for Non-model organism


Relevance:

30.00%

Publisher:

Abstract:

Software is a key component in many of the devices and products we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, and so on. One of the keys to succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, customers today are asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires a great deal of manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is reducing the associated cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software functions correctly. Usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing.
We present approaches to automatically generate tests from behavioral models for solving some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or absent tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools: we offer independent tools, tools integrated with other industry-leading tools, and complete tool-chains when necessary. Many model-based testing approaches proposed by the research community also suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
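In its simplest form, generating tests from a behavioral model amounts to enumerating paths through a state machine. The sketch below illustrates the general idea only; the model, states, and events are hypothetical and are not taken from the thesis or its tools:

```python
from collections import deque

def generate_tests(transitions, start, max_depth):
    """Enumerate event sequences (test cases) up to max_depth, breadth-first."""
    tests, queue = [], deque([(start, [])])
    while queue:
        state, path = queue.popleft()
        if path:                      # every non-empty path is a test case
            tests.append(path)
        if len(path) < max_depth:
            for event, target in transitions.get(state, []):
                queue.append((target, path + [event]))
    return tests

# Hypothetical behavioral model of a login session:
model = {
    "logged_out": [("login", "logged_in")],
    "logged_in": [("logout", "logged_out"), ("query", "logged_in")],
}
tests = generate_tests(model, "logged_out", max_depth=3)
```

Each generated sequence, e.g. `["login", "query", "logout"]`, would then be executed against the system under test while its responses are checked against the model.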

Relevance:

30.00%

Publisher:

Abstract:

The aging of alcoholic beverages is generally conducted in wooden barrels made from species of the genus Quercus. Owing to the high cost and lack of commercial viability of producing these trees in Brazil, there is demand for alternatives that use other, native species and incorporate new technologies, enabling greater competitiveness for sugar cane spirit aged in Brazilian wood. The drying of the wood, the thermal treatment applied to it, and the manufacturing techniques are important tools in defining the sensory quality of alcoholic beverages after contact with the barrels. In thermal treatment, several compounds in the wood are changed by the application of heat; various studies show that, compared with untreated wood, compounds are modified, different aromas develop, color changes, and the beverages achieve an even more pleasant taste. This study evaluated whether significant differences in aroma exist between hydro-alcoholic solutions of sugar cane spirit prepared with different species of heat-treated and untreated wood. An acceptance test was applied to determine which solutions the tasters preferred under the specific test conditions.

Relevance:

30.00%

Publisher:

Abstract:

This research examines the concept of social entrepreneurship, a fairly new business model that has become increasingly popular in the field of business in recent years. Growing environmental awareness and concrete examples of the impact created by social entrepreneurship have encouraged entrepreneurs to address social problems, so that business activity is used to redress society's failures. The purpose of doing business is no longer necessarily just to generate profit; instead, business is run in order to make a social change with the profit gained from operations. Successful social entrepreneurship requires a specific nature, constant creativity, and a strong desire to make a social change. It requires constant balancing between two major objectives: both financial and non-financial issues need to be considered, but not at the expense of one another. While aiming at the social purpose, the business must be run in highly competitive markets. Therefore, both factors need to be integrated equally into the organization, as they are complementary, not mutually exclusive: business does not exist without society, and society cannot go forward without business. Social entrepreneurship, its value creation, and its measurement tools and reporting practices are discussed in this research. An extensive theoretical basis is covered and used to support the findings from the case enterprises studied. Most attention is focused on the concept of Social Return on Investment (SROI), and the case enterprises are analyzed through the SROI process. Social enterprises are mostly small or medium sized, which naturally sets some limitations on implementing measurement tools. The question of resources requires the most attention and therefore sets the biggest constraints. However, the size of the company does not determine everything: the nature of the business and the type of social purpose always need to be considered.
The mission may be so concrete and transparent that any kind of measurement would be superfluous. Implementing measurement tools may be of great benefit, or a huge financial burden. Thus, the very first thing to consider carefully is whether measuring value creation is needed at all.
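The SROI ratio at the heart of this process is simple arithmetic: monetized social value, net of standard deductions, divided by the investment. The sketch below uses entirely hypothetical figures and deduction rates; a real SROI analysis would derive these from stakeholder-specific financial proxies:

```python
def sroi_ratio(outcome_values, investment, deadweight=0.0, attribution=0.0):
    """SROI = net social value created / total investment.
    deadweight: share of the value that would have occurred anyway.
    attribution: share of the value created by other actors."""
    gross = sum(outcome_values)
    net = gross * (1 - deadweight) * (1 - attribution)
    return net / investment

# Hypothetical enterprise: two monetized outcomes, 50,000 invested.
ratio = sroi_ratio([60_000, 30_000], investment=50_000,
                   deadweight=0.2, attribution=0.1)
```

A ratio of, say, 1.3 reads as "1.30 of social value created per 1.00 invested".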

Relevance:

30.00%

Publisher:

Abstract:

Optimal challenge occurs when an individual perceives the challenge of the task to be equaled or matched by his or her own skill level (Csikszentmihalyi, 1990). The purpose of this study was to test the impact of the OPTIMAL model on physical education students' motivation and perceptions of optimal challenge across four games categories (i.e., target, batting/fielding, net/wall, and invasion). Enjoyment, competence, student goal orientation, and activity level were examined in relation to the OPTIMAL model. A total of 22 students (17 M; 5 F) and their parents provided informed consent to take part in the study, and the students were taught four OPTIMAL lessons and four non-OPTIMAL lessons, ranging across the four different games categories, by their own teacher. All students completed the Task and Ego Orientation in Sport Questionnaire (TEOSQ; Duda & Whitehead, 1998), the Intrinsic Motivation Inventory (IMI; McAuley, Duncan, & Tammen, 1987) and the Children's Perception of Optimal Challenge Instrument (CPOCI; Mandigo, 2001). Sixteen students (two per lesson) were observed using the System for Observing Fitness Instruction Time tool (SOFIT; McKenzie, 2002). As well, they participated in a structured interview after each lesson was completed. Quantitative results indicated no overall significant difference in motivational outcomes between OPTIMAL and non-OPTIMAL lessons. However, when the lessons were broken down into games categories, significant differences emerged. Levels of perceived competence were higher in non-OPTIMAL batting/fielding lessons than in OPTIMAL lessons, whereas levels of enjoyment and perceived competence were higher in OPTIMAL invasion lessons than in non-OPTIMAL invasion lessons. Qualitative results revealed significant feelings of skill/challenge balance, enjoyment, and competence in the OPTIMAL lessons. Moreover, the percentage of active movement time was found to be practically twice as high in OPTIMAL lessons as in non-OPTIMAL lessons.

Relevance:

30.00%

Publisher:

Abstract:

Higher plants have evolved a well-conserved set of photoprotective mechanisms, collectively designated Non-Photochemical Quenching of chlorophyll fluorescence (qN), to deal with the inhibitory absorption of excess light energy by the photosystems. Their main contribution originates from safe thermal deactivation of excited states promoted by a highly energized thylakoid membrane, detected via lumen acidification. The precise origins of this energy- or ΔpH-dependent quenching (qE), arising from either decreased energy transfer efficiency in PSII antennae (e.g., Young & Frank, 1996; Gilmore & Yamamoto, 1992; Ruban et al., 1992), from alternative electron transfer pathways in PSII reaction centres (e.g., Schreiber & Neubauer, 1990; Thompson & Brudvig, 1988; Klimov et al., 1977), or from both (Wagner et al., 1996; Walters & Horton, 1993), are a source of considerable controversy. In this study, the origins of qE were investigated in spinach thylakoids using a combination of fluorescence spectroscopic techniques: Pulse Amplitude Modulated (PAM) fluorimetry, pump-probe fluorimetry for the measurement of PSII absorption cross-sections, and picosecond fluorescence decay curves fit to a kinetic model for PSII. Quenching by qE (~60% of maximal fluorescence, Fm) was light-induced in circulating samples and the resulting pH gradient maintained during a dark delay by the lumen-acidifying capabilities of thylakoid membrane H+ ATPases. Results for qE were compared to those for the addition of a known antenna quencher, 5-hydroxy-1,4-naphthoquinone (5-OH-NQ), titrated to achieve the same degree of Fm quenching as for qE. Quenching of the minimal fluorescence yield, F0, was clear (8 to 13%) during formation of qE, indicative of classical antenna quenching (Butler, 1984), although the degree was significantly less than that achieved by addition of 5-OH-NQ.
Although qE induction resulted in an overall increase in absorption cross-section, unlike the decrease expected for antenna quenchers like the quinone, a larger increase in cross-section was observed when qE induction was attempted in thylakoids with collapsed pH gradients (uncoupled by nigericin), in the absence of xanthophyll cycle operation (inhibited by DTT), or in the absence of quenching (ΔpH not maintained in the dark due to omission of ATP). Fluorescence decay curves exhibited a similar disparity between qE-quenched and 5-OH-NQ-quenched thylakoids, although both sets showed accelerated kinetics in the fastest decay components at both F0 and Fm. In addition, the kinetics of dark-adapted thylakoids were nearly identical to those in qE-quenched samples at F0, both accelerated in comparison with thylakoids in which the redox poise of the Oxygen-Evolving Complex was randomized by exposure to low levels of background light (which allowed appropriate comparison with F0 yields from quenched samples). When modelled with the Reversible Radical Pair model for PSII (Schatz et al., 1988), quinone quenching could be sufficiently described by increasing only the rate constant for decay in the antenna (as in Vasil'ev et al., 1998), whereas modelling of data from qE-quenched thylakoids required changes in both the antenna rate constant and in rate constants for the reaction centre. The clear differences between qE and 5-OH-NQ quenching demonstrated that qE could not have its origins in the antenna alone, but is rather accompanied by reaction centre quenching. Defined mechanisms of reaction centre quenching are discussed, also in relation to the observed post-quenching depression in Fm associated with photoinhibition.
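The fluorescence decay analysis referred to above rests on describing F(t) as a sum of exponential components, with quenching appearing as accelerated, short-lived fast components. The sketch below shows only this general bookkeeping; the amplitudes and lifetimes are invented for illustration and are not the thesis data:

```python
import math

def decay(t_ps, components):
    """F(t) = sum_i a_i * exp(-t / tau_i), with t and tau_i in picoseconds."""
    return sum(a * math.exp(-t_ps / tau) for a, tau in components)

def mean_lifetime(components):
    """Amplitude-weighted average lifetime: sum(a_i * tau_i) / sum(a_i)."""
    return sum(a * tau for a, tau in components) / sum(a for a, _ in components)

# Hypothetical (a_i, tau_i) pairs; quenching shortens the fast component
# and shifts amplitude into it:
unquenched = [(0.6, 300.0), (0.4, 1500.0)]
quenched   = [(0.8, 120.0), (0.2, 1200.0)]
```

Fitting such components to measured decays at F0 and Fm, under quenched and unquenched conditions, is what allows rate constants in a kinetic model (antenna versus reaction centre) to be compared.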

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Dyslipidemia is recognized as a major cause of coronary heart disease (CHD). Emerging evidence suggests that the combination of triglycerides (TG) and waist circumference can be used to predict the risk of CHD. However, considering the known limitations of TG, a model combining non-high-density lipoprotein cholesterol (non-HDL = total cholesterol - HDL cholesterol) and waist circumference may be a better predictor of CHD. PURPOSE: The Framingham Offspring Study data were used to determine whether combined non-HDL cholesterol and waist circumference is equivalent to or better than TG and waist circumference (the hypertriglyceridemic waist phenotype) in predicting risk of CHD. METHODS: A total of 3,196 individuals from the Framingham Offspring Study, aged ≥ 40 years, who had fasted overnight for ≥ 9 hours and had no missing information on non-HDL cholesterol, TG levels, and waist circumference measurements, were included in the analysis. The Receiver Operating Characteristic (ROC) curve Area Under the Curve (AUC) was used to compare the predictive ability of non-HDL cholesterol plus waist circumference versus TG plus waist circumference. Cox proportional-hazards models were used to examine the association between the joint distributions of non-HDL cholesterol, waist circumference, and non-fatal CHD; TG, waist circumference, and non-fatal CHD; and the joint distribution of non-HDL cholesterol and TG by waist circumference strata, after adjusting for age, gender, smoking, alcohol consumption, diabetes, and hypertension status. RESULTS: The ROC AUCs associated with non-HDL cholesterol plus waist circumference and TG plus waist circumference are 0.6428 (CI: 0.6183, 0.6673) and 0.6299 (CI: 0.6049, 0.6548), respectively. The difference in the ROC AUCs is 1.29%. The p-value for the test that the difference in ROC AUCs between the two models is zero is 0.10.
There was a stronger positive association between non-HDL cholesterol and the risk of non-fatal CHD within each TG level than for TG within each level of non-HDL cholesterol, especially in individuals with high waist circumference. CONCLUSION: The results suggest that the model including non-HDL cholesterol and waist circumference may be superior at predicting CHD compared to the model including TG and waist circumference.
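The AUC comparison underlying these results can be illustrated with the Mann-Whitney formulation: the AUC of a risk score equals the probability that a randomly chosen case outranks a randomly chosen non-case. The scores below are fabricated for illustration; the study itself used Framingham Offspring data:

```python
def auc(case_scores, control_scores):
    """Mann-Whitney estimate of the ROC AUC for a continuous risk score."""
    wins = ties = 0
    for c in case_scores:
        for n in control_scores:
            if c > n:
                wins += 1
            elif c == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(case_scores) * len(control_scores))

# Hypothetical risk scores from two competing models on the same subjects:
auc_model1 = auc([0.8, 0.7, 0.6], [0.5, 0.4, 0.6])
auc_model2 = auc([0.7, 0.5, 0.6], [0.5, 0.6, 0.4])
delta = auc_model1 - auc_model2   # the quantity tested against zero
```

Whether such a delta is statistically distinguishable from zero is then assessed with a paired test on the correlated AUCs, which is what the reported p-value of 0.10 refers to.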

Relevance:

30.00%

Publisher:

Abstract:

Dehumanizing ideologies that explicitly liken other humans to “inferior” animals can have negative consequences for intergroup attitudes and relations. Surprisingly, very little is known about the causes of dehumanization, and essentially no research has examined strategies for reducing dehumanizing tendencies. The Interspecies Model of Prejudice specifies that animalistic dehumanization may be rooted in basic hierarchical beliefs regarding human superiority over animals. This theoretical reasoning suggests that narrowing the human-animal divide should also reduce dehumanization. The purpose of the present dissertation, therefore, was to gain a more complete understanding of the predictors of and solutions to dehumanization by examining the Interspecies Model of Prejudice, first from a layperson’s perspective and then among young children. In Study 1, laypeople strongly rejected the human-animal divide as a probable cause of, or solution to, dehumanization, despite evidence that their own personal beliefs in the human-animal divide positively predicted their dehumanization (and prejudice) scores. From Study 1, it was concluded that the human-animal divide, despite being a robust empirical predictor of dehumanization, is largely unrecognized as a probable cause of, or solution to, dehumanization by non-experts in the psychology of prejudice. Studies 2 and 3 explored the expression of dehumanization, as well as the Interspecies Model of Prejudice, among children ages six to ten years (Studies 2 and 3) and their parents (Study 3). Across both studies, White children showed evidence of racial dehumanization by attributing fewer “uniquely human” characteristics to a Black child target than to a White child target, representing the first systematic evidence of racial dehumanization among children. In Study 3, path analyses supported the Interspecies Model of Prejudice among children.
Specifically, children’s beliefs in the human-animal divide predicted greater racial prejudice, an effect explained by heightened racial dehumanization. Moreover, parents’ Social Dominance Orientation (a preference for social hierarchy and inequality) positively predicted children’s human-animal divide beliefs. Critically, these effects remained significant even after controlling for established predictors of child prejudice (i.e., parent prejudice, authoritarian parenting, and social-cognitive skills) and relevant child demographics (i.e., age and sex). Similar patterns emerged among parent participants, further supporting the Interspecies Model of Prejudice. Encouragingly, children reported narrower human-animal divide perceptions after being exposed to an experimental prime (versus control) that highlighted the similarities between humans and animals. Together, the three studies reported in this dissertation offer important and novel contributions to the dehumanization and prejudice literature. Not only did we find the first systematic evidence of racial dehumanization among children, but we also established the human-animal divide as a meaningful precursor of dehumanization. Moreover, empirical support was obtained for the Interspecies Model of Prejudice among diverse samples, including university students (Study 1), children (Studies 2 and 3), and adult-aged samples (Study 3). Importantly, each study also highlights the promising social implications of targeting the human-animal divide in interventions to reduce dehumanization and other prejudicial processes.

Relevance:

30.00%

Publisher:

Abstract:

Diatoms are renowned for their robust ability to perform NPQ (Non-Photochemical Quenching of chlorophyll fluorescence) as a dissipative response to heightened light stress on photosystem II, plausibly explaining their dominance over other algal groups in turbulent light environments. Their NPQ mechanism has been principally attributed to a xanthophyll cycle involving the lumenal-pH-regulated, reversible de-epoxidation of diadinoxanthin. The principal goal of this dissertation is to reveal the physiological and physical origins and consequences of the NPQ response in diatoms during short-term transitions to excessive irradiation. The investigation involves diatom species from different originating light environments, to highlight the diversity of diatom NPQ and to facilitate the detection of core mechanisms common to diatoms as a group. A chiefly spectroscopic approach was used to investigate NPQ in diatom cells. The prime methodologies include real-time monitoring of PSII excitation and de-excitation pathways via PAM fluorometry and of pigment interconversion via transient absorbance measurements, the collection of cryogenic absorbance spectra to measure pigment energy levels, and the collection of cryogenic fluorescence spectra and room-temperature picosecond time-resolved fluorescence decay spectra to study excitation energy transfer and dissipation. Chemical inhibitors that target the trans-thylakoid pH gradient, the enzyme responsible for diadinoxanthin de-epoxidation, and photosynthetic electron flow were additionally used to manipulate the NPQ response experimentally. Multifaceted analyses of the NPQ responses of two species not previously characterised photosynthetically, Nitzschia curvilineata and Navicula sp., were used to identify an excitation pressure relief ‘strategy’ for each species.
Three key areas of NPQ were examined: (i) the NPQ activation/deactivation processes, (ii) how NPQ affects the collection, dissipation, and usage of absorbed light energy, and (iii) the interdependence of NPQ and photosynthetic electron flow. It was found that Nitzschia cells regulate excitation pressure via a high-amplitude, reversible, antenna-based quenching which is dependent on the de-epoxidation of diadinoxanthin. In Navicula cells, excitation pressure could be effectively regulated solely within the PSII reaction centre, whilst antenna-based, diadinoxanthin de-epoxidation dependent quenching was implicated as a supplemental, long-lasting source of excitation energy dissipation. These strategies for excitation balance were discussed in the context of resource partitioning under these species’ originating light climates. A more detailed investigation of the NPQ response in Nitzschia was used to develop a comprehensive model describing the mechanism of antenna-centred non-photochemical quenching in this species. The experimental evidence strongly supports a mechanism whereby an acidic lumen triggers the diadinoxanthin de-epoxidation and protonation-mediated aggregation of light harvesting complexes, leading to the formation of quencher chlorophyll a-chlorophyll a dimers with short-lived excited states; quenching relaxes when a rise in lumen pH triggers the dispersal of light harvesting complex aggregates via deprotonation events and the input of diadinoxanthin. This model may also be applicable for describing antenna-based NPQ in other diatom species.

Relevance:

30.00%

Publisher:

Abstract:

The primary goal was to test a mediated-moderation model in which dispositional optimism was the moderator and its role was mediated by problem-focused coping. A secondary goal was to demonstrate that posttraumatic growth can be differentiated from maturation and normal development. Two groups of participants were recruited and completed questionnaires twice, with a 60-day interval: one group (Trauma) described a traumatic experience, and the second group (Non-trauma) described a significant experience. Contrary to the hypothesis, only problem-focused coping and deliberate rumination predicted posttraumatic growth, and these findings were observed only in concurrent analyses. Furthermore, the results indicated no significant difference between groups in growth scores at either Time 1 or Time 2. The findings suggest that the term “posttraumatic growth” may refer to the context in which growth occurs rather than to some developmental process that uniquely follows trauma.

Relevance:

30.00%

Publisher:

Abstract:

Small aggressive non-small cell lung carcinomas (SA-NSCLC) are characterized by spread to distant lymph nodes and metastases while the primary tumour remains small in size, as opposed to tumours that grow relatively large before cancer progression. These small aggressive cancers present a challenge for clinical diagnosis and screening, carry a grave prognosis, and may benefit from a targeted approach to identifying high-risk individuals. The objectives of this thesis were to identify factors associated with SA-NSCLC and to compare survivorship of stage IV SA-NSCLC with that of large stage IV NSCLC. Logistic and Cox regression analyses were performed using data from the National Lung Screening Trial (NLST). Model building was guided by knowledge of lung carcinogenesis and lung cancer prognostic factors. A previous diagnosis of emphysema and, in females, a positive family history of lung cancer were associated with increased risk of SA-NSCLC among adenocarcinomas. Despite overall poor prognosis, SA-NSCLC had a better prognosis compared to large stage IV NSCLC.

Relevance:

30.00%

Publisher:

Abstract:

This paper assesses the empirical performance of an intertemporal option pricing model with latent variables which generalizes the Hull-White stochastic volatility formula. Using this generalized formula in an ad hoc fashion to extract two implicit parameters and forecast next-day S&P 500 option prices, we obtain pricing errors similar to those obtained with implied volatility alone, as in the Hull-White case. When we specialize this model to an equilibrium recursive utility model, we show through simulations that option prices are more informative than stock prices about the structural parameters of the model. We also show that a simple method of moments with a panel of option prices provides good estimates of the parameters of the model. This lays the groundwork for an empirical assessment of this equilibrium model with S&P 500 option prices in terms of pricing errors.
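The Hull-White (1987) result referenced here says that, when volatility shocks carry no risk premium and are independent of stock returns, a European call price equals the Black-Scholes price averaged over the path-wise mean variance. A minimal Monte Carlo sketch of that idea, with illustrative (not calibrated) parameters and a simple lognormal variance process standing in for the paper's latent-variable dynamics:

```python
import math
import random

def bs_call(S, K, r, sigma, T):
    """Black-Scholes price of a European call."""
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def hull_white_call(S, K, r, v0, xi, T, n_paths=2000, n_steps=50, seed=1):
    """Average the BS price over simulated path-wise mean variances."""
    random.seed(seed)
    dt, total = T / n_steps, 0.0
    for _ in range(n_paths):
        v, mean_v = v0, 0.0
        for _ in range(n_steps):
            # driftless lognormal variance diffusion with vol-of-vol xi
            v *= math.exp(xi * math.sqrt(dt) * random.gauss(0.0, 1.0)
                          - 0.5 * xi ** 2 * dt)
            mean_v += v / n_steps
        total += bs_call(S, K, r, math.sqrt(mean_v), T)
    return total / n_paths

price = hull_white_call(S=100.0, K=100.0, r=0.05, v0=0.04, xi=0.3, T=0.5)
```

With vol-of-vol `xi` set to zero the price collapses back to the plain Black-Scholes value, which is the sense in which the stochastic volatility formula generalizes it.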

Relevance:

30.00%

Publisher:

Abstract:

We study the problem of measuring the uncertainty of CGE (or RBC)-type model simulations associated with parameter uncertainty. We describe two approaches for building confidence sets on model endogenous variables. The first one uses a standard Wald-type statistic. The second approach assumes that a confidence set (sampling or Bayesian) is available for the free parameters, from which confidence sets are derived by a projection technique. The latter has two advantages: first, confidence set validity is not affected by model nonlinearities; second, we can easily build simultaneous confidence intervals for an unlimited number of variables. We study conditions under which these confidence sets take the form of intervals and show they can be implemented using standard methods for solving CGE models. We present an application to a CGE model of the Moroccan economy to study the effects of policy-induced increases of transfers from Moroccan expatriates.
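The projection technique can be sketched in a few lines: given a confidence set for the free parameters, a valid confidence interval for any endogenous variable y = g(θ) is the range of g over that set, regardless of how nonlinear g is. The grid and the reduced-form mapping below are hypothetical stand-ins for a solved CGE model:

```python
def projected_interval(g, param_set):
    """Project a parameter confidence set through the model mapping g."""
    values = [g(theta) for theta in param_set]
    return min(values), max(values)

# Hypothetical 95% confidence set for one elasticity, discretized on a grid:
theta_set = [0.5 + 0.01 * i for i in range(51)]      # covers [0.5, 1.0]
g = lambda theta: 100.0 / (1.0 + theta)              # toy nonlinear reduced form
lo, hi = projected_interval(g, theta_set)
```

Simultaneous intervals for any number of endogenous variables come essentially for free: each variable's mapping is projected over the same parameter set, which is the second advantage noted above.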

Relevance:

30.00%

Publisher:

Abstract:

We characterize the solution to a model of consumption smoothing using financing under non-commitment and savings. We show that, under certain conditions, these two different instruments complement each other perfectly. If the rate of time preference is equal to the interest rate on savings, perfect smoothing can be achieved in finite time. We also show that, when random revenues are generated by periodic investments in capital through a concave production function, the level of smoothing achieved through financial contracts can influence the productive investment efficiency. As long as financial contracts cannot achieve perfect smoothing, productive investment will be used as a complementary smoothing device.

Relevance:

30.00%

Publisher:

Abstract:

We examine the relationship between the risk premium on the S&P 500 index return and its conditional variance. We use the SMEGARCH (Semiparametric-Mean EGARCH) model, in which the conditional variance process is EGARCH while the conditional mean is an arbitrary function of the conditional variance. For monthly S&P 500 excess returns, the relationship between the two moments that we uncover is nonlinear and nonmonotonic. Moreover, we find considerable persistence in the conditional variance as well as a leverage effect, as documented by others. Finally, the shape of these relationships appears to be relatively stable over time.
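The EGARCH component can be made concrete with the standard EGARCH(1,1) log-variance recursion; the SMEGARCH extension then leaves the conditional mean as an unspecified function of this variance. The coefficients below are illustrative, not the paper's estimates:

```python
import math
import random

def egarch_variances(returns, omega=-0.1, alpha=0.1, gamma=-0.05, beta=0.95):
    """EGARCH(1,1): log h_t = omega + beta*log h_{t-1}
                              + alpha*(|z_{t-1}| - E|z|) + gamma*z_{t-1},
    where z is the standardized residual; gamma < 0 produces the leverage
    effect (negative shocks raise future variance more than positive ones)."""
    e_abs_z = math.sqrt(2.0 / math.pi)   # E|z| for a standard normal z
    log_h = omega / (1.0 - beta)         # start at the unconditional level
    variances = []
    for r in returns:
        h = math.exp(log_h)
        variances.append(h)
        z = r / math.sqrt(h)
        log_h = omega + beta * log_h + alpha * (abs(z) - e_abs_z) + gamma * z
    return variances

random.seed(0)
sample = [random.gauss(0.0, 0.05) for _ in range(500)]
h = egarch_variances(sample)
```

Because the recursion operates on log h, the variance stays positive without parameter constraints, and a beta near one captures the persistence the paper reports.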

Relevance:

30.00%

Publisher:

Abstract:

In this paper we propose exact likelihood-based mean-variance efficiency tests of the market portfolio in the context of the Capital Asset Pricing Model (CAPM), allowing for a wide class of error distributions which includes normality as a special case. These tests are developed in the framework of multivariate linear regressions (MLR). It is well known, however, that despite their simple statistical structure, standard asymptotically justified MLR-based tests are unreliable. In financial econometrics, exact tests have been proposed for a few specific hypotheses [Jobson and Korkie (Journal of Financial Economics, 1982), MacKinlay (Journal of Financial Economics, 1987), Gibbons, Ross and Shanken (Econometrica, 1989), Zhou (Journal of Finance, 1993)], most of which depend on normality. For the Gaussian model, our tests correspond to Gibbons, Ross and Shanken’s mean-variance efficiency tests. In non-Gaussian contexts, we reconsider mean-variance efficiency tests allowing for multivariate Student-t and Gaussian mixture errors. Our framework allows us to cast more evidence on whether the normality assumption is too restrictive when testing the CAPM. We also propose exact multivariate diagnostic checks (including tests for multivariate GARCH and a multivariate generalization of the well-known variance ratio tests) and goodness-of-fit tests, as well as a set estimate for the intervening nuisance parameters. Our results [over five-year subperiods] show the following: (i) multivariate normality is rejected in most subperiods; (ii) residual checks reveal no significant departures from the multivariate i.i.d. assumption; and (iii) mean-variance efficiency of the market portfolio is not rejected as frequently once the possibility of non-normal errors is allowed for.
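For the Gaussian case, the Gibbons-Ross-Shanken statistic to which these tests correspond can be computed directly from the MLR alphas and residual covariance. The sketch below implements it for N = 2 simulated test assets (the data and coefficients are invented, not the paper's); under the null of efficiency the statistic follows an F(N, T-N-1) distribution:

```python
import random

def ols(y, x):
    """Closed-form simple OLS of y on x with intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    alpha = my - beta * mx
    resid = [b - alpha - beta * a for a, b in zip(x, y)]
    return alpha, beta, resid

def grs_statistic(assets, market):
    """GRS = ((T-N-1)/N) * a' S^-1 a / (1 + mu_m^2/var_m), here for N = 2."""
    T, N = len(market), len(assets)
    alphas, resids = [], []
    for y in assets:
        a, _, r = ols(y, market)
        alphas.append(a)
        resids.append(r)
    # MLE residual covariance and its explicit 2x2 inverse:
    S = [[sum(u * w for u, w in zip(resids[i], resids[j])) / T
          for j in range(N)] for i in range(N)]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    Sinv = [[S[1][1] / det, -S[0][1] / det],
            [-S[1][0] / det, S[0][0] / det]]
    quad = sum(alphas[i] * Sinv[i][j] * alphas[j]
               for i in range(N) for j in range(N))
    mu = sum(market) / T
    var = sum((m - mu) ** 2 for m in market) / T
    return ((T - N - 1) / N) * quad / (1.0 + mu ** 2 / var)

random.seed(42)
mkt = [random.gauss(0.005, 0.04) for _ in range(120)]
# Two assets generated to satisfy the CAPM (true alphas are zero):
assets = [[1.0 * m + random.gauss(0.0, 0.02) for m in mkt],
          [0.8 * m + random.gauss(0.0, 0.02) for m in mkt]]
stat = grs_statistic(assets, mkt)
```

The exactness of the F distribution here depends on Gaussian errors; relaxing that assumption, as the paper does for Student-t and Gaussian mixture errors, is precisely what requires the simulation-based exact procedures it proposes.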