904 results for Process-dissociation Framework
Abstract:
Project Report: The PHAR-IN project ("Competences for industrial pharmacy practice in biotechnology") looked at whether industrial employees and academics differ in how they rank competences for practice in the biotechnological industry. A small expert panel consisting of the authors of this paper produced a biotechnology competence framework by drawing up an initial list of competences and then ranking them in importance using a three-stage Delphi process. The framework was next evaluated and validated by a large expert panel of academics (n = 37) and industrial employees (n = 154). Results show that the priorities of industrial employees and academics were similar. The competences for biotechnology practice that received the highest scores were mainly in "Research and Development", "Upstream and Downstream Processing", "Product development and formulation", "Aseptic processing", "Analytical methodology", "Product stability", and "Regulation". The main area of disagreement was the category "Ethics and drug safety", where academics ranked competences higher than industrial employees did.
Abstract:
Symbolic execution is a powerful program analysis technique, but it is very challenging to apply to programs built using event-driven frameworks, such as Android. The main reason is that the framework code itself is too complex to symbolically execute. The standard solution is to manually create a framework model that is simpler and more amenable to symbolic execution. However, developing and maintaining such a model by hand is difficult and error-prone. We claim that we can leverage program synthesis to introduce a high degree of automation into the process of framework modeling. To support this thesis, we present three pieces of work. First, we introduced SymDroid, a symbolic executor for Android. While Android apps are written in Java, they are compiled to the Dalvik bytecode format. Instead of analyzing an app's Java source, which may not be available, or decompiling from Dalvik back to Java, which requires significant engineering effort and introduces yet another source of potential bugs in an analysis, SymDroid works directly on Dalvik bytecode. Second, we introduced Pasket, a new system that takes a first step toward automatically generating Java framework models to support symbolic execution. Pasket takes as input the framework API and tutorial programs that exercise the framework. From these artifacts and Pasket's internal knowledge of design patterns, Pasket synthesizes an executable framework model by instantiating design patterns, such that the behavior of the synthesized model on the tutorial programs matches that of the original framework. Lastly, in order to scale program synthesis to framework models, we devised adaptive concretization, a novel program synthesis algorithm that combines the best of the two major synthesis strategies: symbolic search, i.e., using SAT or SMT solvers, and explicit search, e.g., stochastic enumeration of possible solutions. Adaptive concretization parallelizes multiple sub-synthesis problems by partially concretizing highly influential unknowns in the original synthesis problem. Thanks to adaptive concretization, Pasket can generate a large-scale model, e.g., thousands of lines of code. In addition, we have used an Android model synthesized by Pasket and found that it is sufficient to allow SymDroid to execute a range of apps.
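To make the idea concrete, here is a minimal Python sketch of adaptive concretization as described above. All names are hypothetical, and a brute-force search over the remaining unknowns stands in for the SAT/SMT-backed symbolic search a real synthesizer such as Pasket would use; the point is only the shape of the algorithm: pick the most influential unknown, concretize it, and race the resulting sub-problems in parallel.

```python
# Minimal sketch of adaptive concretization (hypothetical names).
# Brute force over the remaining unknowns stands in for symbolic
# (SAT/SMT-backed) search of each concretized sub-problem.
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def solve_subproblem(fixed, domains, spec):
    """Search the residual problem with one influential unknown fixed."""
    names = list(domains)
    for values in product(*(domains[n] for n in names)):
        candidate = {**fixed, **dict(zip(names, values))}
        if spec(candidate):                 # candidate satisfies the spec?
            return candidate
    return None

def adaptive_concretize(domains, spec, influence):
    # Concretize the unknown judged most influential...
    pivot = max(domains, key=influence)
    rest = {n: d for n, d in domains.items() if n != pivot}
    # ...and solve one sub-problem per concrete value, in parallel.
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(solve_subproblem, {pivot: v}, rest, spec)
                   for v in domains[pivot]]
        for f in futures:
            if f.result() is not None:
                return f.result()
    return None

# Example: find x, y, z with x*y + z == 23, treating x as most influential.
domains = {"x": range(10), "y": range(10), "z": range(10)}
print(adaptive_concretize(domains,
                          lambda c: c["x"] * c["y"] + c["z"] == 23,
                          influence={"x": 3, "y": 1, "z": 1}.get))
```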
Abstract:
Frequency, recency, and type of prior exposure to very low- and high-frequency words were manipulated in a 3-phase (i.e., familiarization training, study, and test) design. Increasing the frequency with which a definition for a very low-frequency word was provided during familiarization facilitated the word's recognition in both yes-no (Experiment 1) and forced-choice (Experiment 2) paradigms. Recognition of very low-frequency words not accompanied by a definition during familiarization first increased, then decreased, as familiarization frequency increased (Experiment 1). Reasons for these differences were investigated in Experiment 3 using judgments of recency and frequency. Results suggested that prior familiarization of a very low-frequency word with its definition may allow a more adequate episodic representation of the word to be formed during a subsequent study trial. Theoretical implications of these results for current models of memory are discussed.
Abstract:
Event-related potentials (ERPs) were recorded while subjects made old/new recognition judgments on new unstudied words and on old words that had been presented at study either once ('weak') or three times ('strong'). The probability of an 'old' response was significantly higher for strong than for weak words, and significantly higher for weak than for new words. Comparisons were made initially between ERPs to new, weak, and strong words, and subsequently between ERPs associated with six strength-by-response conditions. The N400 component was found to be modulated by memory trace strength in a graded manner: its amplitude was most negative in new-word ERPs and most positive in strong-word ERPs. This 'N400 strength effect' was largest at the left parietal electrode (in ear-referenced ERPs). The amplitude of the late positive complex (LPC) effect was sensitive to decision accuracy (and perhaps confidence); its amplitude was larger in ERPs evoked by words attracting correct versus incorrect recognition decisions. The LPC effect had a left > right, centro-parietal scalp topography (in ear-referenced ERPs). Hence, whereas the majority of previous ERP studies of episodic recognition have interpreted results from the perspective of dual-process models, we provide alternative interpretations of N400 and LPC old/new effects in terms of memory strength and decisional factor(s).
Abstract:
Item noise models of recognition assert that interference at retrieval is generated by the words from the study list. Context noise models of recognition assert that interference at retrieval is generated by the contexts in which the test word has appeared. The authors introduce the bind cue decide model of episodic memory, a Bayesian context noise model, and demonstrate how it can account for data from both the item noise and dual-processing approaches to recognition memory. From the item noise perspective, list strength and list length effects, the mirror effect for word frequency and concreteness, and the effects of the similarity of other words in a list are considered. From the dual-processing perspective, process dissociation data on the effects of length, temporal separation of lists, strength, and diagnosticity of context are examined. The authors conclude that the context noise approach to recognition is a viable alternative to existing approaches.
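Purely as a schematic illustration of the context noise idea (this is not the published bind cue decide model's mathematics), recognition can be cast as a Bayesian log likelihood ratio: evidence for 'old' accrues from feature matches between the reinstated study context and the contexts stored with the test word, against the base rate expected if the word is new.

```python
# Schematic context-noise recognition decision (illustrative only;
# not the bind cue decide model's published equations).
import math

def log_likelihood_ratio(test_context, stored_contexts,
                         p_match_old=0.8, p_match_new=0.5):
    """Log LR for 'old' vs 'new': each feature of each stored context
    matches the test context with probability p_match_old if the word
    was studied in that context, versus the base rate p_match_new."""
    log_lr = 0.0
    for ctx in stored_contexts:
        for a, b in zip(test_context, ctx):
            match = (a == b)
            p_old = p_match_old if match else 1 - p_match_old
            p_new = p_match_new if match else 1 - p_match_new
            log_lr += math.log(p_old / p_new)
    return log_lr                      # respond "old" when positive

test = [1, 0, 1, 1, 0, 1]
print(log_likelihood_ratio(test, [[1, 0, 1, 1, 0, 0]]))  # similar:    > 0
print(log_likelihood_ratio(test, [[0, 1, 0, 0, 1, 0]]))  # dissimilar: < 0
```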
Abstract:
The article discusses the characteristics of public service regulation in the sanitation sector, comparing the forms of service provision adopted by France, England, and Brazil, and how these countries regulate private participation in the sector. While France follows a historical pattern of regulation marked by the leading role of local authorities, using contracts as the instrument par excellence for disciplining the services, England introduced, through an ambitious privatization process, a regulatory framework in which central government agencies are the main actors. Still lacking a defined model, Brazil, through legal innovations, faces the challenge of attracting private investment to a sector marked by federative conflicts between states and municipalities over the ownership of these public services.
Abstract:
Dissertation presented to the Escola Superior de Educação de Lisboa for the degree of Master in Education Sciences, specialization in Supervision in Education.
Abstract:
OBJECTIVE: Develop an index to evaluate maternal and neonatal hospital care in the Brazilian Unified Health System. METHODS: This descriptive cross-sectional study of national scope was based on the structure-process-outcome framework proposed by Donabedian and on comprehensive health care. Data from the Hospital Information System and the National Registry of Health Establishments were used. The maternal and neonatal network of the Brazilian Unified Health System consisted of 3,400 hospitals that performed at least 12 deliveries in 2009 or whose number of deliveries represented 10.0% or more of total admissions in 2009. Relevance and reliability were defined as criteria for the selection of variables. Simple and composite indicators and the index of completeness were constructed and evaluated, and the distribution of maternal and neonatal hospital care was assessed across the different regions of the country. RESULTS: A total of 40 variables were selected, from which 27 simple indicators, five composite indicators, and the index of completeness of care were built. Composite indicators were constructed by grouping simple indicators and covered the following dimensions: hospital size, level of complexity, delivery care practice, recommended hospital practice, and epidemiological practice. The index of completeness of care grouped these five composite indicators and classified hospitals in ascending order, yielding five levels of completeness of maternal and neonatal hospital care: very low, low, intermediate, high, and very high. The hospital network was predominantly of small size and low complexity, with inadequate delivery care and poor development of recommended and epidemiological practices. The index showed that more than 80.0% of hospitals had a low index of completeness of care and that the most qualified health care services were concentrated in the more developed regions of the country. CONCLUSIONS: The index of completeness proved to be of great value for monitoring maternal and neonatal hospital care in the Brazilian Unified Health System and indicated that the quality of health care was unsatisfactory. However, its application does not replace specific evaluations.
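As a rough sketch of the construction described above (grouping simple indicators into composites and mapping a composite score onto the five ordered levels), consider the following; indicator names, weights, and cut-points are placeholders, not the study's values.

```python
# Sketch of a composite indicator and five-level classification.
# Indicator names, weights, and cut-points are placeholders only.
def composite(indicators, weights):
    """Weighted average of simple indicators, each scaled to [0, 1]."""
    total = sum(weights.values())
    return sum(indicators[k] * w for k, w in weights.items()) / total

def completeness_level(score):
    """Map a [0, 1] composite score onto five ordered levels."""
    levels = ["very low", "low", "intermediate", "high", "very high"]
    return levels[min(int(score * 5), 4)]

hospital = {"size": 0.2, "complexity": 0.3, "delivery_care": 0.5,
            "recommended_practice": 0.4, "epidemiological_practice": 0.1}
weights = {k: 1.0 for k in hospital}       # equal weights as a placeholder
print(completeness_level(composite(hospital, weights)))  # -> "low"
```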
Abstract:
Dissertation presented in partial fulfillment of the requirements for the degree of Master in Statistics and Information Management.
Abstract:
Dissertation presented to obtain the degree of Master in Chemical and Biochemical Engineering.
Abstract:
The historically reactive approach to identifying and mitigating safety problems involves selecting black spots or hot spots by ranking locations based on crash frequency and severity. This approach focuses mainly on the corridor level and does not take into consideration the exposure rate (vehicle miles traveled) or the socio-demographic information of the study area, both of which are very important in the transportation planning process. A larger analysis unit, at the Transportation Analysis Zone (TAZ) or network planning level, should be used to address the future development needs of the community and to incorporate safety into the long-range transportation planning process. In this study, existing planning tools (such as the PLANSAFE models presented in NCHRP Report 546) were evaluated for forecasting safety in small and medium-sized communities, particularly as related to changes in socio-demographic characteristics, traffic demand, road network, and countermeasures. The research also evaluated the applicability of the Empirical Bayes (EB) method to network-level analysis. In addition, application of the United States Road Assessment Program (usRAP) protocols at the local urban road network level was investigated. This research evaluated the applicability of these three methods for the City of Ames, Iowa. The outcome of this research is a systematic process and framework for considering road safety issues explicitly in the small and medium-sized community transportation planning process and for quantifying the safety impacts of new developments and policy programs. More specifically, quantitative safety may be incorporated into the planning process through effective visualization and increased awareness of safety issues (usRAP), the identification of high-risk locations with potential for improvement (usRAP maps and EB), countermeasures for high-risk locations (EB before-and-after study and PLANSAFE), and socio-economic and demographic induced changes at the planning level (PLANSAFE).
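For reference, the EB method mentioned above is conventionally computed (following Hauer) as a weighted average of the crash frequency predicted by a safety performance function (SPF) and the observed crash count; the sketch below assumes a negative binomial SPF with overdispersion parameter k, so the weight is w = 1/(1 + k·μ).

```python
# Standard Empirical Bayes estimate of expected crashes (after Hauer),
# assuming a negative binomial SPF with overdispersion parameter k.
def eb_expected_crashes(predicted, observed, k):
    """predicted: SPF-predicted crashes for the study period (mu);
    observed: recorded crash count; k: NB overdispersion parameter."""
    w = 1.0 / (1.0 + k * predicted)   # weight on the SPF prediction
    return w * predicted + (1.0 - w) * observed

# A site with 2.0 predicted and 6 observed crashes (k = 0.5) is pulled
# toward, but not all the way to, the observed count:
print(eb_expected_crashes(2.0, 6, 0.5))   # -> 4.0
```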
Abstract:
The goal of the study is to locate a site appropriate for a 170-offender unit, based on an understanding of current needs, existing and proposed procedures, and the longevity of existing facility use. The study is based on a master-plan process: a framework on which multiple stages of infrastructure development can be based is initiated through the systematic study of infrastructure condition, infrastructure needs, and treatment objectives.
Abstract:
Web services from different partners can be combined into applications that realize a more complex business goal. Such applications, built as Web service compositions, define how interactions between Web services take place in order to implement the business logic. Web service compositions not only have to provide the desired functionality but also have to comply with certain Quality of Service (QoS) levels. Maximizing the users' satisfaction, also reflected as Quality of Experience (QoE), is a primary goal to be achieved in a Service-Oriented Architecture (SOA). Unfortunately, in a dynamic environment like SOA, unforeseen situations may arise, such as services becoming unavailable or not responding within the desired time frame. In such situations, appropriate actions need to be triggered in order to avoid the violation of QoS and QoE constraints. In this thesis, solutions are developed to manage Web services and Web service compositions with regard to QoS and QoE requirements. The Business Process Rules Language (BPRules) was developed to manage Web service compositions when undesired QoS or QoE values are detected. BPRules provides a rich set of management actions that may be triggered to control the service composition and improve its quality behavior. Regarding the quality properties, BPRules distinguishes between the QoS values as promised by the service providers, the QoE values assigned by end-users, the monitored QoS as measured by our BPR framework, and the predicted QoS and QoE values. BPRules facilitates the specification of user groups characterized by different context properties and allows triggering a personalized, context-aware service selection tailored to the specified user groups. In a service market where a multitude of services with the same functionality but different quality values are available, the right services need to be selected to realize the service composition. We developed new and efficient heuristic algorithms for choosing high-quality services for the composition; BPRules offers the possibility to integrate multiple service selection algorithms, and the selection algorithms are also applicable to non-linear objective functions and constraints. The BPR framework includes new approaches for context-aware service selection and quality property prediction; we consider the location of users and services as a context dimension for the prediction of response time and throughput. The BPR framework combines all new features and contributions into a comprehensive management solution. Furthermore, it facilitates flexible monitoring of QoS properties without modifying the description of the service composition. We show how the different modules of the BPR framework work together to execute the management rules. We evaluate how our selection algorithms outperform a genetic algorithm from related research, and the evaluation reveals how context data can be used for personalized prediction of response time and throughput.
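As a generic illustration of the selection problem (not the heuristics developed in this thesis), the sketch below binds each task of a composition to the highest-throughput candidate service that keeps the running response-time budget satisfied.

```python
# Generic greedy sketch of QoS-driven service selection (illustrative;
# not the selection heuristics developed in the thesis).
def select_services(candidates, time_budget):
    """candidates: {task: [(name, response_time_ms, throughput), ...]};
    picks the highest-throughput feasible service per task."""
    remaining = time_budget
    selection = {}
    for task, services in candidates.items():
        feasible = [s for s in services if s[1] <= remaining]
        if not feasible:
            return None              # end-to-end budget cannot be met
        name, rt, _ = max(feasible, key=lambda s: s[2])
        selection[task] = name
        remaining -= rt
    return selection

candidates = {
    "payment":  [("payA", 120, 50), ("payB", 300, 90)],
    "shipping": [("shipA", 200, 70), ("shipB", 80, 40)],
}
print(select_services(candidates, time_budget=400))
# -> {'payment': 'payB', 'shipping': 'shipB'}
```

A greedy pass like this can return suboptimal or infeasible bindings that a global method would avoid, which is why dedicated heuristics and non-linear constraint handling matter in practice.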
Abstract:
Business analysis has developed since the early 1990s as an IS discipline concerned with understanding business problems, defining requirements, and evaluating relevant solutions. However, this discipline has had limited recognition within the academic community, and little research has been conducted into the practices and standards employed by business analysts. This paper reports on a study of business analysis that considered the activities conducted and the outcomes experienced on IS projects. Senior business analysts were interviewed in order to gain insights into the business analyst role and the techniques and approaches applied when conducting this work. The Context, Content, Process, Outcomes framework was adopted as a basis for developing the interview questions. The data collected were analysed using the template analysis technique, with the template based upon this framework. Additional themes concerning aspects of business analysis that may contribute to IS success emerged during data analysis, including the key business analysis activities and the skills business analysts require to perform them. Organisational attitude was also identified as a key factor in enabling the use and contribution of business analysis.
Abstract:
When modeling real-world decision-theoretic planning problems in the Markov Decision Process (MDP) framework, it is often impossible to obtain a completely accurate estimate of transition probabilities. For example, natural uncertainty arises in the transition specification due to elicitation of MDP transition models from an expert or estimation from data, or from non-stationary transition distributions arising from insufficient state knowledge. In the interest of obtaining the most robust policy under transition uncertainty, the Markov Decision Process with Imprecise Transition Probabilities (MDP-IP) has been introduced to model such scenarios. Unfortunately, while various solution algorithms exist for MDP-IPs, they often require external calls to optimization routines and thus can be extremely time-consuming in practice. To address this deficiency, we introduce the factored MDP-IP and propose efficient dynamic programming methods to exploit its structure. Noting that the key computational bottleneck in the solution of factored MDP-IPs is the need to repeatedly solve nonlinear constrained optimization problems, we show how to target approximation techniques to drastically reduce the computational overhead of the nonlinear solver while producing bounded, approximately optimal solutions. Our results show up to two orders of magnitude speedup in comparison to traditional "flat" dynamic programming approaches, and up to an order of magnitude speedup over the extension of factored MDP approximate value iteration techniques to MDP-IPs, while producing the lowest error of any approximation algorithm evaluated.
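To make the MDP-IP objective concrete, here is a toy robust value iteration for a flat interval MDP: for each state-action pair, "nature" picks, within the given probability intervals, the transition distribution that minimizes the backed-up value (computed greedily, which is exact for box constraints on a simplex). The factored representations and approximation techniques that are this paper's actual contribution are not modeled here.

```python
# Toy robust value iteration for a flat interval MDP (illustration only;
# the factored MDP-IP structure and approximations are not modeled).
def worst_case_expectation(values, lower, upper):
    """Minimize sum_i p_i*values[i] over lower <= p <= upper, sum(p) = 1.
    Greedy: start at the lower bounds, then push the leftover mass
    onto the lowest-valued successor states first (exact here)."""
    p = list(lower)
    slack = 1.0 - sum(p)
    for i in sorted(range(len(values)), key=values.__getitem__):
        grant = min(upper[i] - p[i], slack)
        p[i] += grant
        slack -= grant
    return sum(pi * v for pi, v in zip(p, values))

def robust_value_iteration(rewards, intervals, gamma=0.9, eps=1e-6):
    """rewards[s][a]: immediate reward; intervals[s][a] = (lower, upper),
    per-successor probability bounds for taking action a in state s."""
    n = len(rewards)
    V = [0.0] * n
    while True:
        newV = [max(rewards[s][a] + gamma *
                    worst_case_expectation(V, *intervals[s][a])
                    for a in rewards[s])
                for s in range(n)]
        if max(abs(x - y) for x, y in zip(newV, V)) < eps:
            return newV
        V = newV

# Two states, one action; successor probabilities of state 0 are only
# known to lie in [0.4, 0.7] and [0.3, 0.6].
rewards   = [{"a": 1.0}, {"a": 0.0}]
intervals = [{"a": ([0.4, 0.3], [0.7, 0.6])},
             {"a": ([0.0, 0.9], [0.1, 1.0])}]
print(robust_value_iteration(rewards, intervals))
```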