Abstract:
This thesis, entitled "Reliability Modelling and Analysis in Discrete Time: Some Concepts and Models Useful in the Analysis of Discrete Lifetime Data", consists of five chapters. In Chapter II we take up the derivation of some general results useful in reliability modelling that involve two-component mixtures. Expressions for the failure rate, mean residual life and second moment of residual life of the mixture distributions, in terms of the corresponding quantities of the component distributions, are investigated. Some applications of these results are also pointed out. The role of the geometric, Waring and negative hypergeometric distributions as models of life lengths in the discrete time domain has been discussed already. While describing various reliability characteristics, it was found that they can often be considered as a class. The applicability of these models in single populations naturally extends to the case of populations composed of sub-populations, making mixtures of these distributions worth investigating. Accordingly, the general properties, various reliability characteristics and characterizations of these models are discussed in Chapter III. Inference of parameters in mixture distributions is usually a difficult problem because the mass function of the mixture is a linear function of the component masses, which makes manipulation of the likelihood equations, least-squares function, etc., and the resulting computations very difficult. We show that one of our characterizations helps in inferring the parameters of the geometric mixture without computational hazards. As mentioned in the review of results in the previous sections, partial moments have not been studied extensively in the literature, especially in the case of discrete distributions. Chapters IV and V deal with descending and ascending partial factorial moments.
Apart from studying their properties, we prove characterizations of distributions by functional forms of partial moments and establish recurrence relations between successive moments for some well-known families. It is further demonstrated that partial moments are as efficient and convenient as many of the conventional tools for resolving practical problems in reliability modelling and analysis. The study concludes by indicating some new problems that surfaced during the course of the present investigation, which could be the subject of future work in this area.
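As a minimal sketch of the kind of mixture result discussed for Chapter II, the discrete failure rate h(k) = P(X = k) / P(X >= k) of a two-component geometric mixture can be computed directly from the component quantities; the mixing weight and component parameters below are illustrative, not taken from the thesis:

```python
# Sketch: failure rate of a two-component geometric mixture in discrete time.
# geom_pmf uses the support k = 0, 1, 2, ... with success probability q.
def geom_pmf(k, q):
    # P(X = k) = q * (1 - q)**k
    return q * (1 - q) ** k

def mixture_failure_rate(k, p, q1, q2):
    # h(k) = P(X = k) / P(X >= k); geometric survival: P(X >= k) = (1 - q)**k.
    # Mixture pmf and survival are the p-weighted sums of the component ones.
    pmf = p * geom_pmf(k, q1) + (1 - p) * geom_pmf(k, q2)
    surv = p * (1 - q1) ** k + (1 - p) * (1 - q2) ** k
    return pmf / surv
```

A single geometric component (p = 1) recovers the well-known constant failure rate q, while a genuine mixture has a decreasing failure rate that approaches the smaller component rate.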
Abstract:
This article is based on a study of a reform in the organisation of maternity services in the United Kingdom, which aimed at developing a more woman-centred model of care. After decades of fragmentation and depersonalisation of care, associated with the shift of birth to a hospital setting, pressure by midwives and mothers prompted government review and a relatively radical turnaround in policy. However, the emergent model of care has been profoundly influenced by concepts and technologies of monitoring. The use of such technologies as ultrasound scans, electronic foetal monitoring and oxytocic augmentation of labour, generally supported by epidural anaesthesia for pain relief, has accompanied the development of a particular ecological model of birth, often called active management, which is oriented towards the idea of an obstetric norm. Drawing on analysis of women's narrative accounts of labour and birth, this article discusses the impact on women's embodiment in birth, and the sources of information they use about the status of their own bodies, their labour and that of the child. It also illustrates how the impact on women's experiences of birth may be mediated by a relational model of support, through the provision of caseload midwifery care.
Abstract:
Intraclass correlation (ICC) is an established tool to assess inter-rater reliability. In a seminal paper published in 1979, Shrout and Fleiss considered three statistical models for inter-rater reliability data with a balanced design. In their first two models, an infinite population of raters was considered, whereas in their third model, the raters in the sample were considered to be the whole population of raters. In the present paper, we show that the two distinct estimates of ICC developed for the first two models can both be applied to the third model and we discuss their different interpretations in this context.
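The two estimates in question are the usual ICC(2,1) (raters random, Model 2) and ICC(3,1) (sampled raters are the whole population, Model 3), both computable from the same two-way ANOVA mean squares. A minimal sketch, applied to a small illustrative targets-by-raters ratings matrix:

```python
# Sketch: ICC(2,1) and ICC(3,1) from a two-way layout (rows = targets,
# columns = raters), computed via ANOVA mean squares.
def icc_estimates(X):
    n, k = len(X), len(X[0])
    grand = sum(sum(row) for row in X) / (n * k)
    row_means = [sum(row) / k for row in X]
    col_means = [sum(X[i][j] for i in range(n)) / n for j in range(k)]
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)   # targets
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)   # raters
    sse = sum((X[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))
    icc21 = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
    icc31 = (msr - mse) / (msr + (k - 1) * mse)
    return icc21, icc31
```

On the same data the two estimates can differ sharply: ICC(2,1) penalizes systematic rater differences (via the rater mean square), while ICC(3,1) does not, which is exactly why their interpretations under the third model must be kept distinct.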
Abstract:
Strategy is a contested concept. The generic literature is characterized by a diverse range of competing theories and alternative perspectives. Traditional models of the competitive strategy of construction firms have tended to focus on exogenous factors. In contrast, the resource-based view of strategic management emphasizes the importance of endogenous factors. The more recently espoused concept of dynamic capabilities extends consideration beyond static resources to focus on the ability of firms to reconfigure their operating routines to enable responses to changing environments. The relevance of the dynamic capabilities framework to the construction sector is investigated through an exploratory case study of a regional contractor. The focus on how firms continuously adapt to changing environments provides new insights into competitive strategy in the construction sector. Strong support is found for the importance of path dependency in shaping strategic choice. The case study further suggests that strategy is a collective endeavour enacted by a loosely defined group of individual actors. Dynamic capabilities are characterized by an empirical elusiveness and as such are best construed as situated practices embedded within a social and physical context.
Abstract:
The cost of maintenance makes up a large part of total energy costs in ruminants. The metabolizable energy (ME) requirement for maintenance (MEm) is the daily ME intake that exactly balances heat energy (HE). The net energy requirement for maintenance (NEm) is estimated by subtracting from MEm the HE produced by the processing of the diet. MEm cannot be directly measured experimentally and is estimated by measuring basal metabolism in fasted animals or by regression from the recovered energy measured in fed animals. MEm and NEm are usually, but not always, expressed in terms of BW^0.75. However, this scaling factor is substantially empirical and its exponent is often inadequate, especially for growing animals. MEm values estimated by different feeding systems (AFRC, CNCPS, CSIRO, INRA, NRC) were compared using dairy cattle data. The comparison showed that these systems differ both in the approaches used to estimate MEm and in its quantification. The CSIRO system estimated the highest MEm, mostly because it includes a correction factor that increases ME as the feeding level increases. Relative to the CSIRO estimates, those of NRC, INRA, CNCPS, and AFRC were on average 0.92, 0.86, 0.84, and 0.78, respectively. MEm is affected by the previous nutritional history of the animals. This phenomenon is best predicted by dynamic models, of which several have been published in recent decades. They are based either on energy flows or on nutrient flows. Some of the different approaches used are described and discussed.
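The metabolic body weight scaling mentioned above can be sketched as a one-line calculation. The coefficient used here (0.486 MJ of ME per kg^0.75 per day) is purely illustrative, a ballpark for cattle chosen to show the form of the calculation, not the value of any of the feeding systems compared:

```python
# Sketch: MEm scaled to metabolic body weight, MEm = coeff * BW**0.75.
# Both the coefficient and the exponent are empirical; the exponent 0.75
# is the conventional choice criticized in the text as often inadequate.
def me_maintenance(bw_kg, coeff=0.486, exponent=0.75):
    # Returns an illustrative MEm in MJ/day for a body weight in kg.
    return coeff * bw_kg ** exponent
```

For a 600 kg cow this gives roughly 59 MJ/day; changing the exponent (e.g., for growing animals) shifts the estimate substantially, which is the point of the criticism.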
Abstract:
The report examines the relationship between day care institutions, schools and so-called "parents unfamiliar to education", as well as the relationship between the institutions. Within Danish public and professional discourse, concepts like "parents unfamiliar to education" usually refer to environments, parents or families with either no or only very restricted experience of education beyond basic school (folkeskole). The "grand old man" of Danish educational research, Prof. Em. Erik Jørgen Hansen, defines the concept as follows: parents who are distant from or not familiar with education are parents without a tradition of education, and by that fact they are not able to contribute constructively to backing up their own children during their education. Many teachers and pedagogues are not used to that term; they rather prefer concepts like "socially exposed" or "socially disadvantaged" parents, social classes or strata. The report does not only focus on parents who are not capable of supporting the school achievements of their children, since a low level of education is usually connected with social disadvantage. Such parents are often not capable of understanding and meeting the demands of the school when sending their children to school. They lack the competencies, or the necessary competence of action. At present much attention is paid by the Ministries of Education and Social Affairs (recently renamed the Ministry of Welfare) to creating equal possibilities for all children. Many kinds of expertise (agencies, councils, researchers, etc.) have been more than eager to promote recommendations aiming at an ambitious goal: by 2015, 95% of all young people should complete a full education (classes 10-12). Research results point out the importance of increased participation of parents. In other words, the agenda is set for 'parents' education'.
It seems necessary to underline that Danish welfare policy has been changing rather radically. The classic model was an understanding of welfare as social assurance and/or social distribution, based on social solidarity. The modern model treats welfare as social service and/or social investment. This means that citizens are changing role, from user and/or citizen to consumer and/or investor. The Danish state, in correspondence with decisions taken by the government, is investing in a national future shaped by global competition. The new models of welfare, "service" and "investment", imply severe changes in hitherto known concepts of family life, the relationship between parents and children, etc. As an example, the investment model points to a new implementation of the relationship between social rights and the rights of freedom. The service model has demonstrated the weakness that access to qualified services in the fields of health or education is becoming more and more dependent on private purchasing power. The weakness of the investment model is that it represents a sort of "the winner takes it all", since a political majority is enabled to set agendas in societal fields formerly protected by the tripartite power and the rights of freedom of the citizens. The outcome of the Danish development seems to be the establishment of a politically governed public service industry which on the one hand is capable of competing on market conditions and on the other can be governed by contracts. This represents a new form of close linking of politics, economy and professional work. Attempts at controlling education, pedagogy and thereby the population are not a recent invention; in European history we could easily point to several such experiments. What is genuinely new is the linking of political priorities to the exercise of public activities through economic incentives.
By defining visible goals for public servants, by introducing measurement of achievements and effects, and by implementing a new wage policy dependent on achievements and/or effects, a new system of accountability is manufactured. The consequences are already perceptible: the government decides on special interventions concerning parents, children or youngsters; public servants at municipality level are instructed to carry out their services by following a manual; and parents are no longer protected by privacy. Protection of privacy and of minorities is no longer a valid argument to prevent further interventions in people's lives (health, food, school, etc.). Citizens are becoming objects of investment, which also implies that people are investing in their own health, education and family. This means that investments in changes of lifestyle and the development of competences go hand in hand. The programmes mentioned below are conditioned by this shift.
Abstract:
Innovation studies have been of interest not only to scholars from various fields such as economics, management and sociology, but also to industrial practitioners and policy makers. In this vast and fruitful field, the theory of diffusion of innovations, which has been driven by a sociological approach, has played a vital role in our understanding of the mechanisms behind industrial change. In this paper, our aim is to give a state-of-the-art review of diffusion of innovation models in a structural and conceptual way, with special reference to photovoltaics. We first discuss, as underlying background, how diffusion of innovations theory differs from other innovation studies. Secondly, we give a brief taxonomical review of modelling methodologies together with comparative discussions. Finally, we put this wealth of modelling in the context of photovoltaic diffusion and suggest some future directions.
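One canonical member of the model families such a taxonomy covers is the Bass diffusion model, in which new adoptions mix an innovation effect with an imitation effect. A minimal discrete-time sketch; the parameters (p, q, market size m) are illustrative, not estimates for photovoltaics:

```python
# Sketch: discrete-time Bass diffusion model. Each period the number of
# new adopters is (p + q * F) * (m - N), where F = N/m is the adopted
# fraction, p the innovation coefficient, q the imitation coefficient,
# and m the market potential.
def bass_adopters(p, q, m, periods):
    cumulative = 0.0
    path = []
    for _ in range(periods):
        new = (p + q * cumulative / m) * (m - cumulative)
        cumulative += new
        path.append(cumulative)
    return path
```

The resulting cumulative-adoption path is the familiar S-curve: strictly increasing, bounded by m, and saturating once imitation has run its course.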
Abstract:
Cognitive linguistics scholars argue that metaphor is fundamentally a conceptual process of mapping one domain of experience onto another domain. The study of metaphor in the context of Translation Studies has not, unfortunately, kept pace with the discoveries about the nature and role of metaphor in the cognitive sciences. This study aims primarily to fill part of this knowledge gap. Specifically, the thesis is an attempt to explore some implications of the conceptual theory of metaphor for translation. Because the study of metaphor in translation is also based on views about the nature of translation, the thesis first presents a general overview of the discipline of Translation Studies, describing the major models of translation. The study (in Chapter Two) then discusses the major traditional theories of metaphor (comparison, substitution and interaction theories) and shows how the ideas of those theories were adopted in specific translation studies of metaphor. After that, the study presents a detailed account of the conceptual theory of metaphor and some hypothetical implications for the study of metaphor in translation from the perspective of cognitive linguistics. The data and methodology are presented in Chapter Four. A novel classification of conceptual metaphor is presented which distinguishes between different source domains of conceptual metaphors: physical, human-life and intertextual. It is suggested that each source domain places different demands on translators. The major sources of the data for this study are (1) the translations done by the Foreign Broadcasting Information Service (FBIS), a translation service of the Central Intelligence Agency (CIA) in the United States of America, of a number of speeches by the Iraqi president Saddam Hussein during the Gulf Crisis (1990-1991), and (2) official (governmental) Omani translations of National Day speeches of Sultan Qaboos bin Said of Oman.
Abstract:
This work presents a two-dimensional risk assessment method based on the quantification of the probability of occurrence of contaminant source terms, as well as the assessment of the resultant impacts. The risk is calculated using Monte Carlo simulation methods, whereby synthetic contaminant source terms are generated according to the same distribution as historically occurring pollution events, or according to an a priori potential probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer are then simulated from repeated realisations using integrated mathematical models. The number of times user-defined ranges of concentration magnitudes are exceeded is quantified as risk. The utility of the method was demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together was evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as the trends within the study area. The capability to generate synthetic pollution events from numerous potential sources, based on the historical frequency of their occurrence, proved to be a great asset of the method and a large benefit over contemporary methods.
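The exceedance-counting idea can be sketched in a few lines: risk at a monitoring point is the fraction of Monte Carlo realisations in which the simulated concentration exceeds a user-defined threshold. The lognormal source-term distribution and the trivial transport step (a simple dilution factor) below are placeholders for the historically fitted distributions and integrated transport models of the study:

```python
import random

# Sketch: Monte Carlo exceedance risk at one monitoring point.
# Source terms are drawn from an illustrative lognormal distribution;
# "transport" is reduced to a constant dilution factor for brevity.
def exceedance_risk(n_realisations, threshold, dilution=0.1, seed=42):
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_realisations):
        source = rng.lognormvariate(2.0, 1.0)  # synthetic source term (mu=2, sigma=1)
        concentration = dilution * source       # placeholder transport model
        if concentration > threshold:
            exceed += 1
    return exceed / n_realisations
```

Repeating this per monitoring point (and per concentration range) yields the values that the study maps spatially; risk naturally decreases as the threshold rises.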
Abstract:
Deep Neural Networks (DNNs) have revolutionized a wide range of applications beyond traditional machine learning and artificial intelligence fields, e.g., computer vision, healthcare, natural language processing and others. At the same time, edge devices have become central in our society, generating an unprecedented amount of data which could be used to train data-hungry models such as DNNs. However, the potentially sensitive or confidential nature of gathered data poses privacy concerns when storing and processing them in centralized locations. To this end, decentralized learning decouples model training from the need of directly accessing raw data, by alternating on-device training and periodic communications. The ability to distil knowledge from decentralized data, however, comes at the cost of facing more challenging learning settings, such as coping with heterogeneous hardware and network connectivity, statistical diversity of data, and ensuring verifiable privacy guarantees. This Thesis proposes an extensive overview of decentralized learning literature, including a novel taxonomy and a detailed description of the most relevant system-level contributions in the related literature for privacy, communication efficiency, data and system heterogeneity, and poisoning defense. Next, this Thesis presents the design of an original solution to tackle communication efficiency and system heterogeneity, and empirically evaluates it on federated settings. For communication efficiency, an original method, specifically designed for Convolutional Neural Networks, is also described and evaluated against the state-of-the-art. Furthermore, this Thesis provides an in-depth review of recently proposed methods to tackle the performance degradation introduced by data heterogeneity, followed by empirical evaluations on challenging data distributions, highlighting strengths and possible weaknesses of the considered solutions.
Finally, this Thesis presents a novel perspective on the usage of Knowledge Distillation as a means of optimizing decentralized learning systems in settings characterized by data heterogeneity or system heterogeneity. Our vision of relevant future research directions closes the manuscript.
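The periodic-communication step underlying the federated settings evaluated above can be sketched with the classic FedAvg aggregation rule: clients train locally, then a server averages their parameters weighted by local dataset size. This illustrates only the standard aggregation step, not the Thesis's original methods; models are reduced to plain parameter vectors:

```python
# Sketch: FedAvg-style aggregation of client model parameters.
# client_params: one flat parameter vector per client.
# client_sizes: number of local training examples per client (the weights).
def fedavg(client_params, client_sizes):
    total = sum(client_sizes)
    avg = [0.0] * len(client_params[0])
    for params, size in zip(client_params, client_sizes):
        w = size / total  # weight proportional to local dataset size
        for i, p in enumerate(params):
            avg[i] += w * p
    return avg
```

The size weighting is what makes the aggregate equivalent to a single pass over the pooled data under i.i.d. assumptions; data heterogeneity breaks exactly this equivalence, motivating the corrective methods the Thesis reviews.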
Abstract:
This paper reviews a wide range of tools for comprehensive sustainability assessments at whole tourism destinations, covering socio-cultural, economic and environmental issues. It considers their strengths, weaknesses and site specific applicability. It is intended to facilitate their selection (and combination where necessary). Tools covered include Sustainability Indicators, Environmental Impact Assessment, Life Cycle Assessment, Environmental Audits, Ecological Footprints, Multi-Criteria Analysis and Adaptive Environmental Assessment. Guidelines for evaluating their suitability for specific sites and situations are given as well as examples of their use.
Abstract:
Data pertaining to the reputations, self-concepts and coping strategies of thirty-one secondary school Volatile Solvent Users (VSUs), forty-four ex-VSUs, and forty-eight non-VSUs in the Perth Metropolitan area of Western Australia were obtained using the High School Student Activity Questionnaire. Findings revealed that significant differences between current VSUs, ex-VSUs, and non-VSUs were more attributable to factors of reputation enhancement than to factors of either self-concept or coping strategies. Current VSUs identified themselves as both having and wanting to have a more non-confronting reputation, and as admiring drug-related activities significantly more than both ex-VSUs and non-VSUs. Two coping variables were also found to be significant indicating that females use more nonproductive coping strategies and external coping strategies than males. No interaction effects were identified. The implications for drug education and further research are discussed.
Abstract:
Some patients are no longer able to communicate effectively or even interact with the outside world in ways that most of us take for granted. In the most severe cases, tetraplegic or post-stroke patients are literally 'locked in' their bodies, unable to exert any motor control after, for example, a spinal cord injury or a brainstem stroke, requiring alternative methods of communication and control. But we suggest that, in the near future, their brains may offer them a way out. Non-invasive electroencephalogram (EEG)-based brain-computer interfaces (BCIs) can be characterized by the technique used to measure brain activity and by the way that different brain signals are translated into commands that control an effector (e.g., controlling a computer cursor for word processing and accessing the internet). This review focuses on the basic concepts of EEG-based BCI, the main advances in communication, motor control restoration and the down-regulation of cortical activity, and the mirror neuron system (MNS) in the context of BCI. The latter appears to be relevant for clinical applications in the coming years, particularly for severely limited patients. Hypothetically, the MNS could provide a robust way to map neural activity to behavior, representing high-level information about the goals and intentions of these patients. Non-invasive EEG-based BCIs allow brain-derived communication in patients with amyotrophic lateral sclerosis and motor control restoration in patients after spinal cord injury and stroke. Epilepsy and attention-deficit hyperactivity disorder patients were able to down-regulate their cortical activity. Given the rapid progression of EEG-based BCI research over the last few years and the swift ascent of computer processing speeds and signal analysis techniques, we suggest that emerging ideas (e.g., the MNS in the context of BCI) related to the clinical neuro-rehabilitation of severely limited patients will generate viable clinical applications in the near future.
Concepts and determination of reference values for human biomonitoring of environmental contaminants
Abstract:
Human biomonitoring (HBM) of environmental contaminants plays an important role in estimating exposure and evaluating risk, and thus it has been increasingly applied in the environmental field. The results of HBM must be compared with reference values (RV). The term "reference values" has always been related to the interpretation of clinical laboratory tests. For physicians, RV indicate "normal values" or "limits of normal"; in turn, toxicologists prefer the terms "background values" or "baseline values" to refer to the presence of contaminants in biological fluids. This discrepancy leads to the discussion of which population should be selected to determine RV. Whereas clinical chemistry employs an altered health state as the main exclusion criterion when selecting a reference population (that is, a "healthy" population is selected), in environmental toxicology the exclusion criterion is abnormal exposure to xenobiotics. Therefore, the choice of population to determine RV is based on the very purpose of the RV to be determined. The present paper discusses the concepts and methodology used to determine RV for biomarkers of chemical environmental contaminants.
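Once the reference population has been fixed by the appropriate exclusion criterion, an RV is typically derived as an upper percentile of the biomarker measurements in that population. A minimal sketch; the 95th-percentile convention and the nearest-rank method are illustrative choices, not prescriptions from the paper:

```python
import math

# Sketch: reference value as an upper percentile of biomarker measurements
# from an already-screened reference population, using the simple
# nearest-rank percentile convention.
def reference_value(measurements, percentile=95):
    data = sorted(measurements)
    rank = math.ceil(percentile / 100 * len(data))  # nearest-rank index (1-based)
    return data[rank - 1]
```

Which population feeds this calculation (a "healthy" one, or one without abnormal xenobiotic exposure) is precisely the choice the paper discusses; the arithmetic itself is the easy part.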