928 results for TEST CASE GENERATION
Abstract:
Drying is a major and challenging step in the pre-treatment of biomass for the production of second generation synfuels for transport. The biomass feedstocks are mostly wet and need to be dried from 30-60 wt% moisture content to about 10-15 wt%. The present survey aims to define and evaluate a few of the most promising optimised concepts for biomass pre-treatment schemes in the production of second generation synfuels for transport. The most promising commercially available drying processes were reviewed, focusing on the applications, operational factors and emissions of dryers. The most common dryers currently applied to biomass in bio-energy plants are direct rotary dryers, but the use of steam drying techniques is increasing. Steam drying systems enable the integration of the dryer with existing energy sources. In addition to integration, emissions and fire or explosion risks have to be considered when selecting a dryer for the plant. In steam drying there are no gaseous emissions, but the aqueous effluents often need treatment. Concepts for biomass pre-treatment were defined for two different cases: a large-scale wood-based gasification synfuel production and a small-scale pyrolysis process based on wood chips and miscanthus bundles. For the first case a pneumatic conveying steam dryer was suggested; in the second case flue gas is used as the drying medium in a direct or indirect rotary dryer.
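To illustrate the scale of the drying task described in this abstract, the following is a minimal sketch of a wet-basis moisture mass balance. The 50 wt% and 12 wt% values are assumptions chosen from within the ranges quoted above; they are not figures from the survey itself.

```python
# Illustrative wet-basis moisture mass balance (assumed values within the
# 30-60 wt% -> 10-15 wt% ranges quoted above; not taken from the survey).

def water_removed_per_kg_wet(m_in: float, m_out: float) -> float:
    """Return kg of water evaporated per kg of wet feedstock when drying
    from wet-basis moisture fraction m_in to m_out."""
    dry_matter = 1.0 - m_in                  # kg dry solids per kg wet feed
    final_mass = dry_matter / (1.0 - m_out)  # kg of product at target moisture
    return 1.0 - final_mass                  # kg of water evaporated

if __name__ == "__main__":
    evaporated = water_removed_per_kg_wet(0.50, 0.12)
    print(f"Water evaporated: {evaporated:.3f} kg per kg of wet feedstock")
    # ~0.432 kg of water must be removed per kg of 50 wt% feed to reach 12 wt%.
```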
Abstract:
We discuss aggregation of data from neuropsychological patients and the process of evaluating models using data from a series of patients. We argue that aggregation can be misleading but not aggregating can also result in information loss. The basis for combining data needs to be theoretically defined, and the particular method of aggregation depends on the theoretical question and characteristics of the data. We present examples, often drawn from our own research, to illustrate these points. We also argue that statistical models and formal methods of model selection are a useful way to test theoretical accounts using data from several patients in multiple-case studies or case series. Statistical models can often measure fit in a way that explicitly captures what a theory allows; the parameter values that result from model fitting often measure theoretically important dimensions and can lead to more constrained theories or new predictions; and model selection allows the strength of evidence for models to be quantified without forcing this into the artificial binary choice that characterizes hypothesis testing methods. Methods that aggregate and then formally model patient data, however, are not automatically preferred to other methods. Which method is preferred depends on the question to be addressed, characteristics of the data, and practical issues like availability of suitable patients, but case series, multiple-case studies, single-case studies, statistical models, and process models should be complementary methods when guided by theory development.
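The abstract argues for formal model selection over binary hypothesis testing when evaluating theories against patient data. The following is a minimal sketch of one such approach, an AIC comparison of two candidate models fitted to a single patient's accuracy data; the models, conditions and counts are invented for illustration and are not the article's analysis.

```python
# Illustrative AIC-based model comparison on invented patient data
# (correct/total counts in two conditions); not the article's actual analysis.
import math

def binom_loglik(k: int, n: int, p: float) -> float:
    """Binomial log-likelihood of k successes in n trials with probability p."""
    p = min(max(p, 1e-9), 1 - 1e-9)
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log(1 - p))

# Hypothetical patient: correct responses out of trials in two conditions.
data = {"words": (42, 60), "nonwords": (21, 60)}

# Model 1: one accuracy parameter shared across conditions.
k_total = sum(k for k, n in data.values())
n_total = sum(n for k, n in data.values())
p_shared = k_total / n_total
ll1 = sum(binom_loglik(k, n, p_shared) for k, n in data.values())
aic1 = 2 * 1 - 2 * ll1

# Model 2: a separate accuracy parameter per condition.
ll2 = sum(binom_loglik(k, n, k / n) for k, n in data.values())
aic2 = 2 * 2 - 2 * ll2

print(f"Shared-parameter model AIC: {aic1:.1f}")
print(f"Condition-specific model AIC: {aic2:.1f} (lower AIC = better supported)")
```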
Abstract:
Whilst target costing and strategic management accounting (SMA) continue to be of considerable interest to academic accountants, both suffer from a relative dearth of empirically based research. Similarly, economic value added (EVA) has been the subject of little research at the level of the individual firm. The aim of this paper is to contribute to both the management accounting and value-based management literatures by analysing how one major European-based MNC introduced EVA into its target costing system. The case raises important questions about both the feasibility of cascading EVA down to product level and the compatibility of customer-facing versus shareholder-focused systems of performance management. We provide preliminary evidence that target costing can be used to align both of these perspectives, and when combined with other SMA techniques it can serve as "the bridge connecting strategy formulation with strategy execution and profit generation" (Ansari et al., 2007, p. 512). © 2012 Elsevier Ltd.
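EVA is conventionally computed as net operating profit after tax less a charge for capital employed. A minimal sketch of that calculation cascaded to a hypothetical product line follows; the figures, the margin rule and the capital allocation are assumptions for illustration only and are not data from the case company.

```python
# Illustrative product-level EVA calculation with assumed figures;
# the case company's actual data and allocation rules are not shown here.

def eva(nopat: float, invested_capital: float, wacc: float) -> float:
    """Economic value added = NOPAT minus the charge for capital employed."""
    return nopat - invested_capital * wacc

# Hypothetical product line within a target-costing exercise.
target_price = 120.0      # market-driven selling price per unit
required_margin = 0.15    # margin required of the product
target_cost = target_price * (1 - required_margin)

units = 10_000
nopat = (target_price - target_cost) * units   # simplification: margin treated as NOPAT
capital_allocated = 1_200_000                  # capital traced to the product
wacc = 0.09                                    # weighted average cost of capital

print(f"Target cost per unit: {target_cost:.2f}")
print(f"Product-level EVA: {eva(nopat, capital_allocated, wacc):,.0f}")
```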
Abstract:
Purpose – The purpose of this paper is to evaluate how a UK business school is addressing the Government's skills strategy through its Graduate Certificate in Management, to identify good practice and development needs, and to clarify how the Graduate Certificate is adapting to the needs of Generation X and Millennial students. The paper also aims to test Kolb and Kolb's experiential learning theory (ELT) in a business school setting. Design/methodology/approach – A case study methodology was adopted. In order to get a cross-section of views and triangulate the data, three focus groups were held, supported by a review of documentation about the programme of study. Findings – The skills strategy is not just an ambition for some business schools, but is already part of the curriculum. Generation X and the Millennials have more in common with the positive attitudes associated with older generations than stereotyped views might allow. ELT provides a useful theoretical framework for evaluating a programme of study and student attitudes. Research limitations/implications – The research findings from one case study are reported, limiting the generalisability of the study. Practical implications – Good practice and development needs are identified which support the implementation of the Government's skills strategy and address employer concerns about student skills. Originality/value – New empirical data are reported which support the use of ELT in evaluating programmes of study and student attitudes to work.
Abstract:
Although considerable effort has been invested in the measurement of banking efficiency using Data Envelopment Analysis, hardly any empirical research has focused on the comparison of banks in Gulf States countries. This paper employs data on the Gulf States banking sector for the period 2000-2002 to develop efficiency scores and rankings for both Islamic and conventional banks. We then investigate productivity change using the Malmquist Index and decompose it into technical change and efficiency change. Further, hypothesis testing and statistical precision in the context of nonparametric efficiency and productivity measurement are used. Specifically, cross-country analysis of efficiency and comparisons of efficiency between Islamic banks and conventional banks are investigated using the Mann-Whitney test.
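A minimal sketch of the kind of nonparametric comparison mentioned above, applying the Mann-Whitney test to two sets of efficiency scores; the scores below are invented placeholders, not the paper's results.

```python
# Illustrative Mann-Whitney U comparison of DEA efficiency scores;
# the score vectors are invented placeholders, not the study's data.
from scipy.stats import mannwhitneyu

islamic_scores = [0.81, 0.92, 0.74, 0.88, 0.95, 0.79, 0.85]
conventional_scores = [0.77, 0.83, 0.69, 0.90, 0.72, 0.80, 0.76]

stat, p_value = mannwhitneyu(islamic_scores, conventional_scores,
                             alternative="two-sided")
print(f"U statistic: {stat:.1f}, p-value: {p_value:.3f}")
# A small p-value would indicate a difference between the efficiency
# distributions of the two banking groups.
```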
Abstract:
TEST is a novel taxonomy of knowledge representations based on three distinct hierarchically organized representational features: Tropism, Embodiment, and Situatedness. Tropic representational features reflect constraints of the physical world on the agent's ability to form, reactivate, and enrich embodied (i.e., resulting from the agent's bodily constraints) conceptual representations embedded in situated contexts. The proposed hierarchy entails that representations can, in principle, have tropic features without necessarily having situated and/or embodied features. On the other hand, representations that are situated and/or embodied are likely to be simultaneously tropic. Hence, although we propose tropism as the most general term, the hierarchical relationship between embodiment and situatedness is more on a par, such that the dominance of one component over the other relies on the distinction between offline storage versus online generation as well as on representation-specific properties. © 2013 Cognitive Science Society, Inc.
Abstract:
We demonstrate a compact, all-room-temperature picosecond laser source broadly tunable in the visible spectral region between 600 nm and 627 nm. The tunable radiation is obtained by frequency-doubling a tunable quantum-dot external-cavity mode-locked laser in a periodically-poled KTP multimode waveguide. In this case, exploiting a significant difference in the effective refractive indices of the high- and low-order modes enables the poling period to be matched over a very broad wavelength range. The maximum achieved second harmonic output peak power is 3.25 mW at 613 nm for 71.43 mW of launched pump peak power at 1226 nm, corresponding to a conversion efficiency of 4.55%. © 2013 Copyright SPIE.
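The quoted conversion efficiency follows directly from the two peak powers given in the abstract; a quick check:

```python
# Second-harmonic conversion efficiency from the peak powers quoted above.
p_shg_mw = 3.25      # second-harmonic output peak power at 613 nm, mW
p_pump_mw = 71.43    # launched pump peak power at 1226 nm, mW

efficiency = p_shg_mw / p_pump_mw
print(f"Conversion efficiency: {efficiency:.2%}")   # ~4.55%
```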
Abstract:
Astrocytes are essential for neuronal function and survival, so both cell types were included in a human neurotoxicity test-system to assess the protective effects of astrocytes on neurons, compared with a culture of neurons alone. The human NT2.D1 cell line was differentiated to form either a co-culture of post-mitotic NT2.N neuronal (TUJ1, NF68 and NSE positive) and NT2.A astrocytic (GFAP positive) cells (∼2:1 NT2.A:NT2.N), or an NT2.N mono-culture. Cultures were exposed to human toxins for 4 h at sub-cytotoxic concentrations in order to compare levels of compromised cell function and thus evidence of an astrocytic protective effect. Functional endpoints examined included assays for cellular energy (ATP) and glutathione (GSH) levels, generation of hydrogen peroxide (H2O2) and caspase-3 activation. Generally, the NT2.N/A co-culture was more resistant to toxicity, maintaining superior ATP and GSH levels and sustaining smaller significant increases in H2O2 levels compared with neurons alone. However, the pure neuronal culture showed a significantly lower level of caspase activation. These data suggest that, besides supporting neurons through maintenance of ATP and GSH and control of H2O2 levels, astrocytes may promote an apoptotic mode of cell death following exposure to some substances. Thus, the use of astrocytes in an in vitro predictive neurotoxicity test-system may be more relevant to human CNS structure and function than neuronal cells alone. © 2007 Elsevier Ltd. All rights reserved.
Abstract:
Hospitals everywhere are integrating health data using electronic health record (EHR) systems, and disparate and multimedia patient data can be input by different caregivers at different locations as encapsulated patient profiles. Healthcare institutions are also using the flexibility and speed of wireless computing to improve quality and reduce costs. We are developing a mobile application that allows doctors to efficiently record and access complete and accurate real-time patient information. The system integrates medical imagery with textual patient profiles, as well as expert interactions by healthcare personnel, using knowledge management and case-based reasoning techniques. The application can assist other caregivers in searching large repositories of previous patient cases. Patients' symptoms can be input to a portable device and the application can quickly retrieve similar profiles, which can be used to support effective diagnoses and prognoses by comparing symptoms, treatments, diagnoses, test results and other patient information. © 2007 Sage Publications.
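A minimal sketch of case-based retrieval of similar patient profiles in the spirit of the application described above; the similarity measure (symptom-set overlap) and the toy case base are assumptions for illustration and do not reproduce the actual system's knowledge-management components.

```python
# Illustrative case-based retrieval over a toy case base; the similarity
# measure and the records are assumptions, not the application's code.
from typing import List, Set, Tuple

case_base = [
    {"id": "P001", "symptoms": {"fever", "cough", "fatigue"}, "diagnosis": "influenza"},
    {"id": "P002", "symptoms": {"chest pain", "shortness of breath"}, "diagnosis": "angina"},
    {"id": "P003", "symptoms": {"fever", "rash", "fatigue"}, "diagnosis": "measles"},
]

def jaccard(a: Set[str], b: Set[str]) -> float:
    """Symptom-set similarity: intersection over union."""
    return len(a & b) / len(a | b) if a | b else 0.0

def retrieve(symptoms: Set[str], k: int = 2) -> List[Tuple[float, dict]]:
    """Return the k most similar past cases for the given symptoms."""
    ranked = sorted(((jaccard(symptoms, c["symptoms"]), c) for c in case_base),
                    key=lambda pair: pair[0], reverse=True)
    return ranked[:k]

for score, case in retrieve({"fever", "fatigue", "cough"}):
    print(f"{case['id']}: similarity {score:.2f}, prior diagnosis {case['diagnosis']}")
```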
Abstract:
PURPOSE. To assess systemic and ocular vascular reactivity in response to warm and cold provocation in untreated patients with primary open-angle glaucoma (POAG) and normal control subjects. METHODS. Twenty-four patients with primary open-angle glaucoma and 22 normal control subjects were subjected to a modified cold pressor test involving immersion of the right hand in 40°C warm water followed by 4°C cold water exposure, and finger and ocular blood flow were assessed by means of peripheral laser Doppler flowmetry and retinal flowmetry, respectively. Finger and body temperature as well as intraocular pressure, systemic blood pressure, systemic pulse pressure, heart rate, and ocular perfusion pressure were also monitored. RESULTS. The patients with glaucoma demonstrated an increase in diastolic blood pressure (P = 0.023), heart rate (P = 0.010), and mean ocular perfusion pressure (P = 0.039) during immersion of the tested hand in 40°C water. During cold provocation, the patients demonstrated a significant decrease in finger (P = 0.0003) and ocular blood flow (the parameter velocity measured at the temporal neuroretinal rim area; P = 0.021). Normal subjects did not demonstrate any blood flow or finger temperature changes during immersion of the tested hand in 40°C water (P > 0.05); however, they exhibited increases in systolic blood pressure (P = 0.034) and pulse pressure (P = 0.0009) and a decrease in finger blood flow (P = 0.0001) during cold provocation. In normal subjects, ocular blood flow was unchanged during both high- and low-temperature challenge. CONCLUSIONS. Cold provocation elicits different blood pressure and ocular blood flow responses in patients with primary open-angle glaucoma compared with control subjects. These findings suggest systemic autonomic failure and ocular vascular dysregulation in POAG patients.
Abstract:
This article presents a small-setting case study on the benefits of using TSPi in a software project. A process adapted from the current one and based on TSPi was defined. The pilot project had schedule and budget constraints. The process began by gathering historical data from previous projects in order to build a measurement repository. The project was launched with the following goals: increase productivity, reduce test time and improve product quality. Finally, the results were analysed and the goals were verified.
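A minimal sketch of how such goals could be verified against the historical baseline gathered at launch; the metric names and numbers are invented placeholders, not the pilot project's measurements.

```python
# Illustrative goal verification against a historical baseline; the metrics
# and figures are invented placeholders, not the pilot project's data.
baseline = {"productivity_loc_per_hour": 18.0, "test_time_hours": 120.0, "defect_density": 4.2}
pilot    = {"productivity_loc_per_hour": 22.5, "test_time_hours": 95.0,  "defect_density": 3.1}

goals = {
    "productivity_loc_per_hour": lambda old, new: new > old,   # increase productivity
    "test_time_hours":           lambda old, new: new < old,   # reduce test time
    "defect_density":            lambda old, new: new < old,   # improve product quality
}

for metric, met in goals.items():
    status = "met" if met(baseline[metric], pilot[metric]) else "not met"
    print(f"{metric}: baseline {baseline[metric]}, pilot {pilot[metric]} -> goal {status}")
```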
Abstract:
This paper describes the methodology followed to automatically generate titles for a corpus of questions belonging to sociological opinion polls. Titles for questions have a twofold function: (1) they are the input of user searches and (2) they describe the whole content of the question and its possible answer options. Thus, title generation can be considered a case of automatic summarization. However, the fact that summarization had to be performed over very short texts, together with the aforementioned quality conditions imposed on the newly generated titles, led the authors to follow knowledge-rich, domain-dependent strategies for summarization rather than the more common extractive techniques.
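A minimal sketch of the knowledge-rich, template-based flavour of title generation described above; the question pattern, the template and the fallback are invented for illustration and do not reproduce the paper's domain resources.

```python
# Illustrative template-based title generation for a poll question;
# the pattern and template are invented, not the paper's actual rules.
import re

def generate_title(question: str) -> str:
    """Map a known question pattern to a short, informative title."""
    m = re.match(r"How satisfied are you with (.+)\?", question, re.IGNORECASE)
    if m:
        return f"Satisfaction with {m.group(1)}"
    # Fallback: truncate the question itself (a crude extractive baseline).
    return question.rstrip("?")[:60]

print(generate_title("How satisfied are you with the public health system?"))
# -> "Satisfaction with the public health system"
```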
Abstract:
The best results in the application of computing systems to automatic translation are obtained when the texts processed pertain to specific thematic areas, with well-defined structures and a concise, limited lexicon. In this article we present a plan of systematic work for the analysis and generation of language applied to the field of pharmaceutical leaflets, a type of document characterized by a rigid format and precise use of vocabulary. We propose a solution based on the use of an interlingua as a pivot language between the source and target languages; in this application case we consider Spanish and Arabic.
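A minimal sketch of the interlingua-pivot idea: the source sentence is analysed into a language-neutral representation, from which the target text is generated. The tiny analyser, the leaflet sentence frame and the English generator (used here as a readable stand-in; the article targets Arabic) are all invented for illustration.

```python
# Illustrative interlingua pivot for a single leaflet-style pattern;
# the analyser and generator are invented, not the described system.

def analyse_es(sentence: str) -> dict:
    """Very small analyser for the Spanish frame 'Tomar N comprimido(s) al día'."""
    tokens = sentence.lower().rstrip(".").split()
    dose = int(tokens[1])
    # Language-neutral (interlingua) representation of the instruction.
    return {"action": "TAKE", "object": "TABLET", "quantity": dose, "frequency": "PER_DAY"}

def generate_en(concepts: dict) -> str:
    """Generate a target-language surface form from the interlingua concepts."""
    plural = "s" if concepts["quantity"] != 1 else ""
    return f"Take {concepts['quantity']} tablet{plural} per day."

interlingua = analyse_es("Tomar 2 comprimidos al día.")
print(interlingua)
print(generate_en(interlingua))   # -> "Take 2 tablets per day."
```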
Abstract:
Today, the focus is shifting to the creation of bio-energy, biofuels and bioproducts from cellulosic biomass derived from various sources, including existing and new crops and their residues, trees and forest residues, and municipal or industrial wastes. At present, biomass co-firing in modern coal power plants with efficiencies up to 45% is the most cost-effective use of biomass for power generation. Due to feedstock availability issues, dedicated biomass plants for combined heat and power (CHP) are typically of smaller size and lower electrical efficiency compared to coal plants. The financial model discussed in the chapter is suitable for all countries, both in the West and in the developing world. From the economic analysis given in the chapter it can be concluded that intermediate pyrolysis technology is very effective in terms of the quality of the oil produced, and that the return on investment is around 4 to 5 years.
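The 4-to-5-year return on investment quoted above corresponds to a simple payback calculation; a minimal sketch follows, with assumed placeholder cash flows rather than the chapter's actual financial model.

```python
# Illustrative simple-payback calculation; the capital cost and annual net
# cash flow are assumed placeholders, not figures from the chapter.
capital_cost = 4_500_000.0           # plant investment
annual_net_cash_flow = 1_000_000.0   # product revenues minus operating costs

payback_years = capital_cost / annual_net_cash_flow
print(f"Simple payback period: {payback_years:.1f} years")   # 4.5 years here
```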
Abstract:
Bio energy is a renewable energy and a solution to depleting fossil fuels. Bio energy such as heat, power and bio fuel is generated by conversion technologies using biomass, for example domestic waste, root crops, forest residue and animal slurry. Pyrolysis, anaerobic digestion and combined heat and power engines are examples of these technologies. Depending on its nature, a biomass can be treated with various technologies, yielding products that can be further treated with other technologies and eventually converted into the final products as bio energy. The pathway followed by the biomass, technologies, intermediate products and bio energy in the conversion process is referred to as a bio energy pathway. Identification of appropriate pathways optimizes the conversion process. Although there are various approaches to creating or generating the pathways, there is still a need for a semantic approach to pathway generation that allows checking the consistency of the knowledge and sharing and extending it efficiently. This paper presents an ontology-based approach to the automatic generation of pathways for biomass to bio energy conversion, which exploits the definition and hierarchical structure of the biomass and technologies, their relationships and associated properties, and infers appropriate pathways. A case study has been carried out in a real-life scenario, the bio energy project for the North West of Europe (Bioen NW), which showed promising results.
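A minimal sketch of pathway inference over a small feedstock/technology graph; the classes, input/output relations and the depth-first search below are simplified stand-ins for the ontology-based reasoning described in the abstract, not the Bioen NW system itself.

```python
# Illustrative pathway generation over a toy biomass/technology graph;
# the entries are simplified stand-ins for the ontology described above.
from typing import Dict, List, Tuple

# Each technology accepts certain inputs and produces certain outputs.
technologies: Dict[str, Tuple[set, set]] = {
    "anaerobic_digestion": ({"domestic_waste", "animal_slurry"}, {"biogas", "digestate"}),
    "pyrolysis":           ({"forest_residue", "wood_chips"},    {"bio_oil", "char", "syngas"}),
    "chp_engine":          ({"biogas", "syngas"},                {"heat", "power"}),
}

def pathways(feedstock: str, targets: set, path=None) -> List[List[str]]:
    """Depth-first search for technology chains from a feedstock to a target bio energy product."""
    path = path or []
    results = []
    for tech, (inputs, outputs) in technologies.items():
        if tech in path or feedstock not in inputs:
            continue
        if outputs & targets:
            results.append(path + [tech])
        for product in outputs:
            results.extend(pathways(product, targets, path + [tech]))
    return results

for p in pathways("animal_slurry", {"heat", "power"}):
    print(" -> ".join(["animal_slurry"] + p))
# e.g. animal_slurry -> anaerobic_digestion -> chp_engine
```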