810 results for 380109 Psychological Methodology, Design and Analysis
Abstract:
BACKGROUND: The combined effects of vanillin and syringaldehyde on xylitol production by Candida guilliermondii were studied using response surface methodology (RSM). A 2² full-factorial central composite design was employed for the experimental design and the analysis of the results. RESULTS: Maximum xylitol productivity (Q_p = 0.74 g L⁻¹ h⁻¹) and yield (Y_P/S = 0.81 g g⁻¹) can be attained by adding only vanillin at 2.0 g L⁻¹ to the fermentation medium. These predictions agreed closely with the experimental results obtained (0.69 ± 0.04 g L⁻¹ h⁻¹ and 0.77 ± 0.01 g g⁻¹), indicating good agreement with the predicted values. C. guilliermondii was able to convert vanillin completely after 24 h of fermentation, with a 94% yield of vanillyl alcohol. CONCLUSIONS: The bioconversion of xylose into xylitol by C. guilliermondii is strongly dependent on the combination of aldehydes and phenolics in the fermentation medium. Vanillin is a phenolic compound able to improve xylitol production by the yeast. The conversion of vanillin to vanillyl alcohol reveals the potential of this yeast for medium detoxification. © 2009 Society of Chemical Industry
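A minimal sketch (with made-up numbers, not the paper's data) of the response-surface workflow described above: coded design points of a 2² central composite design, a quadratic surface fitted by least squares, and a grid search for the predicted optimum. Factor levels and productivities are illustrative only.

```python
import numpy as np

# Coded levels of a 2^2 central composite design (factorial, axial and centre points).
a = np.sqrt(2.0)
design = np.array([
    [-1, -1], [1, -1], [-1, 1], [1, 1],      # factorial points
    [-a, 0], [a, 0], [0, -a], [0, a],        # axial points
    [0, 0], [0, 0], [0, 0],                  # centre replicates
])

# Hypothetical xylitol productivities (g L^-1 h^-1) measured at each design point.
q_p = np.array([0.55, 0.70, 0.48, 0.52, 0.60, 0.72, 0.58, 0.50, 0.63, 0.62, 0.64])

def quad_terms(x):
    """Full quadratic model terms: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

beta, *_ = np.linalg.lstsq(quad_terms(design), q_p, rcond=None)

# Predicted optimum over the coded design space.
g1, g2 = np.meshgrid(np.linspace(-a, a, 101), np.linspace(-a, a, 101))
grid = np.column_stack([g1.ravel(), g2.ravel()])
pred = quad_terms(grid) @ beta
best = grid[np.argmax(pred)]
print("predicted optimum (coded units):", best, "Q_p:", pred.max())
```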
Abstract:
The performance optimisation of overhead conductors depends on the systematic investigation of the fretting fatigue mechanisms in the conductor/clamping system. As a consequence, a fretting fatigue rig was designed and a limited range of fatigue tests was carried out in the medium-to-high cycle fatigue regime in order to obtain an exploratory S-N curve for a Grosbeak conductor mounted on a mono-articulated aluminium clamping system. Subsequent to these preliminary fatigue tests, the components of the conductor/clamping system, such as the ACSR conductor, upper and lower clamps, bolt and nuts, were subjected to a failure analysis procedure in order to investigate the metallurgical free variables interfering with the fatigue test results, aiming at the optimisation of testing reproducibility. The results indicated that the rupture of the planar fracture surfaces observed in the external Al strands of the conductor tested under the lower bending amplitude (0.9 mm) occurred by fatigue cracking (1 mm deep), followed by shear overload. The V-type fracture surfaces observed in some Al strands of the conductor tested under the higher bending amplitude (1.3 mm) were also produced by fatigue cracking (approximately 400 μm deep), followed by shear overload. Shear overload fracture (45° fracture surface) was also observed on the remaining Al wires of the conductor tested under the higher bending amplitude (1.3 mm). Additionally, the upper and lower Al-cast clamps presented microstructure-sensitive cracking, which was followed by particle detachment and the formation of abrasive debris on the clamp/conductor tribo-interface, promoting the fretting mechanism even further. The detrimental formation of abrasive debris might be inhibited by selecting a more suitable class of as-cast Al alloy for the production of the clamps. Finally, the bolt/nut system showed intense degradation of the carbon steel nut (fabricated from ferritic-pearlitic carbon steel, featuring machined threads with 190 HV), with intense plastic deformation and loss of material. Proper selection of both the bolt and nut materials and of the finishing process might prevent the loss of clamping pressure during fretting testing. It is important to control the specification of these components (clamps, bolt and nuts) prior to the start of large-scale fretting fatigue testing of overhead conductors in order to increase the reproducibility of this assessment. © 2008 Elsevier Ltd. All rights reserved.
Abstract:
A combination of modelling and analysis techniques was used to design a six-component force balance. The balance was designed specifically for the measurement of impulsive aerodynamic forces and moments characteristic of hypervelocity shock tunnel testing using the stress wave force measurement technique. Aerodynamic modelling was used to estimate the magnitude and distribution of forces, and finite element modelling to determine the mechanical response of proposed balance designs. Simulation of balance performance was based on aerodynamic loads and mechanical responses using convolution techniques. Deconvolution was then used to assess balance performance and to guide further design modifications leading to the final balance design. © 2001 Elsevier Science Ltd. All rights reserved.
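A minimal sketch (not the authors' code) of the convolution/deconvolution idea behind stress-wave force measurement: the measured strain y(t) is modelled as the convolution of an impulse response g(t) with the applied load u(t), and the load is recovered by regularized deconvolution in the frequency domain. The impulse response and load history below are invented for illustration.

```python
import numpy as np

def simulate_response(g, u):
    """Forward model: strain = g * u (discrete convolution, truncated to len(u))."""
    return np.convolve(g, u)[: len(u)]

def deconvolve(y, g, eps=1e-3):
    """Recover the load from the measured strain by Wiener-style deconvolution."""
    n = len(y)
    G = np.fft.rfft(g, n)
    Y = np.fft.rfft(y, n)
    U = Y * np.conj(G) / (np.abs(G) ** 2 + eps)   # regularization avoids dividing by ~0
    return np.fft.irfft(U, n)

# Hypothetical decaying-oscillation impulse response and a short triangular load pulse.
t = np.linspace(0.0, 1e-3, 1000)
g = np.exp(-t / 2e-4) * np.sin(2 * np.pi * 20e3 * t)      # assumed impulse response
u_true = np.interp(t, [0, 5e-5, 1e-4], [0.0, 1.0, 0.0])   # assumed load history
y = simulate_response(g, u_true)
u_est = deconvolve(y, g)
print(float(np.max(np.abs(u_est - u_true))))  # reconstruction error for this noiseless toy case
```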
Abstract:
Objective: To describe and analyse the study design and manuscript deficiencies in original research articles submitted to Emergency Medicine. Methods: This was a retrospective, analytical study. Articles were enrolled if the reports of the Section Editor and two reviewers were available. Data were extracted from these reports only. Outcome measures were the mean number and nature of the deficiencies and the mean reviewers’ assessment score. Results: Fifty-seven articles were evaluated (28 accepted for publication, 19 rejected, 10 pending revision). The mean (± SD) number of deficiencies was 18.1 ± 6.9, 16.4 ± 6.5 and 18.4 ± 6.7 for all articles, articles accepted for publication and articles rejected, respectively (P = 0.31 between accepted and rejected articles). The mean assessment scores (0–10) were 5.5 ± 1.5, 5.9 ± 1.5 and 4.7 ± 1.4 for all articles, articles accepted for publication and articles rejected, respectively. Accepted articles had a significantly higher assessment score than rejected articles (P = 0.006). For each group, there was a negative correlation between the number of deficiencies and the mean assessment score (P > 0.05). Significantly more rejected articles ‘… did not further our knowledge’ (P = 0.0014) and ‘… did not describe background information adequately’ (P = 0.049). Many rejected articles had ‘… findings that were not clinically or socially significant’ (P = 0.07). Common deficiencies among all articles included ambiguity of the methods (77%) and results (68%), conclusions not warranted by the data (72%), poor referencing (56%), inadequate study design description (51%), unclear tables (49%), an overly long discussion (49%), limitations of the study not described (51%), inadequate definition of terms (49%) and subject selection bias (40%). Conclusions: Researchers should undertake studies that are likely to further our knowledge and be clinically or socially significant. Deficiencies in manuscript preparation are more frequent than mistakes in study design and execution. Specific training or assistance in manuscript preparation is indicated.
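A small illustrative sketch (with invented numbers, not the study's data) of the kind of comparisons reported above: accepted versus rejected deficiency counts, and the correlation between deficiency count and assessment score.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
accepted = rng.normal(16.4, 6.5, 28)   # hypothetical deficiency counts, accepted articles
rejected = rng.normal(18.4, 6.7, 19)   # hypothetical deficiency counts, rejected articles

t, p = stats.ttest_ind(accepted, rejected)
print(f"accepted vs rejected deficiencies: t={t:.2f}, p={p:.3f}")

scores = 9.0 - 0.2 * accepted + rng.normal(0, 1, 28)   # hypothetical assessment scores
r, p_r = stats.pearsonr(accepted, scores)
print(f"deficiencies vs score: r={r:.2f}, p={p_r:.3f}")
```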
Abstract:
MicroRNAs (miRNA) are recognized posttranscriptional gene repressors involved in the control of almost every biological process. Allelic variants in these regions may be an important source of phenotypic diversity and contribute to disease susceptibility. We analyzed the genomic organization of 325 human miRNAs (release 7.1, miRBase) to construct a panel of 768 single-nucleotide polymorphisms (SNPs) covering approximately 1 Mb of genomic DNA, including 131 isolated miRNAs (40%) and 194 miRNAs arranged in 48 miRNA clusters, as well as their 5-kb flanking regions. Of these miRNAs, 37% were inside known protein-coding genes, which were significantly associated with biological functions regarding neurological, psychological or nutritional disorders. SNP coverage analysis revealed a lower SNP density in miRNAs compared with the average of the genome, with only 24 SNPs located in the 325 miRNAs studied. Further genotyping of 340 unrelated Spanish individuals showed that more than half of the SNPs in miRNAs were either rare or monomorphic, in agreement with the reported selective constraint on human miRNAs. A comparison of the minor allele frequencies between Spanish and HapMap population samples confirmed the applicability of this SNP panel to the study of complex disorders among the Spanish population, and revealed two miRNA regions, hsa-mir-26a-2 in the CTDSP2 gene and hsa-mir-128-1 in the R3HDM1 gene, showing geographical allelic frequency variation among the four HapMap populations, probably because of differences in natural selection. The designed miRNA SNP panel could help to identify still hidden links between miRNAs and human disease.
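A minimal sketch (with hypothetical genotype counts) of the kind of minor-allele-frequency comparison described above: compute the MAF per population sample from genotype calls and compare two populations for one SNP. The counts and population labels are invented for illustration.

```python
from collections import Counter

def maf(genotypes):
    """Minor allele frequency from a list of genotypes such as 'AA', 'AG', 'GG'."""
    alleles = Counter(a for g in genotypes for a in g)
    total = sum(alleles.values())
    return min(alleles.values()) / total if len(alleles) > 1 else 0.0

# Hypothetical genotype calls for one miRNA-region SNP in two population samples.
spanish = ["AA"] * 120 + ["AG"] * 60 + ["GG"] * 10
hapmap_ceu = ["AA"] * 80 + ["AG"] * 70 + ["GG"] * 20

print("MAF Spanish:", round(maf(spanish), 3), "MAF CEU:", round(maf(hapmap_ceu), 3))
```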
Abstract:
BACKGROUND AND STUDY AIMS: Appropriate use of colonoscopy is a key component of quality management in gastrointestinal endoscopy. In an update of a 1998 publication, the 2008 European Panel on the Appropriateness of Gastrointestinal Endoscopy (EPAGE II) defined appropriateness criteria for various colonoscopy indications. This introductory paper therefore deals with methodology, general appropriateness, and a review of colonoscopy complications. METHODS: The RAND/UCLA Appropriateness Method was used to evaluate the appropriateness of various diagnostic colonoscopy indications, with 14 multidisciplinary experts using a scale from 1 (extremely inappropriate) to 9 (extremely appropriate). Evidence reported in a comprehensive updated literature review was used for these decisions. Consolidation of the ratings into three appropriateness categories (appropriate, uncertain, inappropriate) was based on the median and the heterogeneity of the votes. The experts then met to discuss areas of disagreement in the light of existing evidence, followed by a second rating round and a subsequent third voting round on necessity criteria, using much more stringent criteria (i.e. colonoscopy is deemed mandatory). RESULTS: Overall, 463 indications were rated, with 55%, 16% and 29% of them judged appropriate, uncertain and inappropriate, respectively. Perforation and hemorrhage rates, as reported in 39 studies, were in general < 0.1% and < 0.3%, respectively. CONCLUSIONS: The updated EPAGE II criteria constitute an aid to clinical decision-making but should in no way replace individual judgment. Detailed panel results are freely available on the internet (www.epage.ch) and will thus constitute a reference source of information for clinicians.
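A minimal sketch of the median-plus-heterogeneity consolidation rule described above. The specific thresholds used here (median tertiles 1-3/4-6/7-9, and "disagreement" when at least a third of the panel votes in each extreme tertile) are a common RAND/UCLA convention and are an assumption, not taken from the EPAGE II paper itself.

```python
from statistics import median

def classify(votes, panel_size=14):
    """Map a list of 1-9 appropriateness votes to one of three categories."""
    med = median(votes)
    low = sum(1 for v in votes if v <= 3)
    high = sum(1 for v in votes if v >= 7)
    disagreement = low >= panel_size / 3 and high >= panel_size / 3
    if disagreement or 4 <= med <= 6:
        return "uncertain"
    return "appropriate" if med >= 7 else "inappropriate"

print(classify([8, 9, 7, 8, 9, 7, 8, 7, 9, 8, 7, 8, 9, 7]))  # -> appropriate
print(classify([2, 9, 1, 8, 9, 2, 8, 1, 9, 2, 8, 1, 9, 2]))  # -> uncertain (polarized panel)
```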
Abstract:
State Highway Departments and local street and road agencies are currently faced with aging highway systems and a need to extend the life of some of the pavements. The agency engineer should have the opportunity to explore the use of multiple surface types in the selection of a preferred rehabilitation strategy. This study was designed to look at the portland cement concrete overlay alternative and especially the design of overlays for existing composite (portland cement and asphaltic cement concrete) pavements. Existing design procedures for portland cement concrete overlays deal primarily with an existing asphaltic concrete pavement with an underlying granular base or stabilized base. This study reviewed those design methods and moved to the development of a design for overlays of composite pavements. It deals directly with existing portland cement concrete pavements that have been overlaid with successive asphaltic concrete overlays and are in need of another overlay due to poor performance of the existing surface. The results of this study provide the engineer with a way to use existing deflection technology coupled with materials testing and a combination of existing overlay design methods to determine the design thickness of the portland cement concrete overlay. The design methodology provides guidance for the engineer, from the evaluation of the existing pavement condition through the construction of the overlay. It also provides a structural analysis of the effect of various joint and widening patterns on the performance of such designs. This work provides the engineer with a portland cement concrete overlay solution for composite pavements or conventional asphaltic concrete pavements that are in need of surface rehabilitation.
Abstract:
BACKGROUND AND PURPOSE: Stroke registries are valuable tools for obtaining information about stroke epidemiology and management. The Acute STroke Registry and Analysis of Lausanne (ASTRAL) prospectively collects epidemiological, clinical, laboratory and multimodal brain imaging data of acute ischemic stroke patients in the Centre Hospitalier Universitaire Vaudois (CHUV). Here, we describe the design and methods used to create ASTRAL and present baseline data of our patients (2003 to 2008). METHODS: All consecutive patients admitted to CHUV between January 1, 2003 and December 31, 2008 with acute ischemic stroke within 24 hours of symptom onset were included in ASTRAL. Patients arriving beyond 24 hours, or with transient ischemic attack, intracerebral hemorrhage, subarachnoid hemorrhage, or cerebral sinus venous thrombosis, were excluded. Recurrent ischemic strokes were registered as new events. RESULTS: Between 2003 and 2008, 1633 patients and 1742 events were registered in ASTRAL. There was a preponderance of males, even in the elderly. Cardioembolic stroke was the most frequent type of stroke. Most strokes were of minor severity (National Institutes of Health Stroke Scale [NIHSS] score ≤ 4 in 40.8% of patients). Cardioembolic strokes and dissections presented with the most severe clinical picture. There was a substantial number of patients with unknown-onset stroke, including wake-up stroke (n=568, 33.1%). Median time from last-known-well time to hospital arrival was 142 minutes for known-onset and 759 minutes for unknown-onset stroke. The rate of intravenous or intra-arterial thrombolysis between 2003 and 2008 increased from 10.8% to 20.8% in patients admitted within 24 hours of last-known-well time. Acute brain imaging was performed in 1695 patients (97.3%) within 24 hours. Of the 1358 patients (78%) who underwent acute computed tomography angiography, 717 (52.8%) had significant abnormalities. Of the 1068 supratentorial stroke patients who underwent acute perfusion computed tomography (61.3%), focal hypoperfusion was demonstrated in 786 (73.6%). CONCLUSIONS: This hospital-based prospective registry of consecutive acute ischemic strokes incorporates demographic, clinical, metabolic, acute perfusion, and arterial imaging data. It is characterized by a high proportion of minor and unknown-onset strokes, a short onset-to-admission time for known-onset patients, rapidly increasing thrombolysis rates, and significant vascular and perfusion imaging abnormalities in the majority of patients.
Abstract:
In Switzerland, organ procurement is well organized at the national level, but transplant outcomes have not been systematically monitored so far. Therefore, a novel project, the Swiss Transplant Cohort Study (STCS), was established. The STCS is a prospective multicentre study, designed as a dynamic cohort, which enrolls all solid organ recipients at the national level. Key features of the STCS are a flexible patient-case system that allows all transplant scenarios to be captured, and the collection of patient-specific and allograft-specific data. Beyond comprehensive clinical data, specific focus is directed at psychosocial and behavioral factors, infectious disease development, and bio-banking. Between May 2008 and the end of 2011, the six Swiss transplant centers recruited 1,677 patients involving 1,721 transplantations and a total of 1,800 organs implanted in 15 different transplantation scenarios. 10% of all patients underwent re-transplantation and 3% had a second transplantation, either in the past or during follow-up. 34% of all kidney allografts originated from living donation. Up to the end of 2011 we observed 4,385 infection episodes in our patient population. The STCS has demonstrated the operational capability to collect high-quality data and to adequately reflect the complexity of the post-transplantation process. The STCS represents a promising novel project for comparative effectiveness research in transplantation medicine.
Abstract:
This thesis considers the modeling and analysis of noise and interconnects in on-chip communication. Besides transistor count and speed, the capabilities of a modern design are often limited by on-chip communication links. These links typically consist of multiple interconnects that run parallel to each other for long distances between functional or memory blocks. Due to technology scaling, the interconnects have considerable electrical parasitics that affect their performance, power dissipation and signal integrity. Furthermore, because of electromagnetic coupling, the interconnects in a link need to be considered as an interacting group instead of as isolated signal paths. There is a need for accurate and computationally efficient models in the early stages of the chip design process to assess or optimize issues affecting these interconnects. For this purpose, a set of analytical models is developed for on-chip data links in this thesis. First, a model is proposed for crosstalk and intersymbol interference. The model takes into account the effects of inductance, initial states and bit sequences. Intersymbol interference is shown to affect crosstalk voltage and propagation delay depending on bus throughput and the amount of inductance. Next, a model is proposed for the switching current of a coupled bus. The model is combined with an existing model to evaluate power supply noise. The model is then applied to reduce both functional crosstalk and power supply noise caused by a bus, as a trade-off with time. The proposed reduction method is shown to be effective in reducing long-range crosstalk noise. The effects of process variation on encoded signaling are then modeled. In encoded signaling, the input signals to a bus are encoded using additional signaling circuitry. The proposed model includes variation in both the signaling circuitry and the wires to calculate the total delay variation of a bus. The model is applied to study level-encoded dual-rail and 1-of-4 signaling. In addition to regular voltage-mode and encoded voltage-mode signaling, current-mode signaling is a promising technique for global communication. A model for energy dissipation in RLC current-mode signaling is proposed in the thesis. The energy is derived separately for the driver, wire and receiver termination.
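A greatly simplified lumped-RC sketch of capacitive crosstalk between an aggressor and a quiet victim wire. The thesis develops far more detailed distributed RLC models, so this toy model only illustrates the coupling mechanism; all element values below are made up.

```python
R = 1e3        # driver plus wire resistance per line (ohm), assumed
Cg = 100e-15   # capacitance to ground per line (F), assumed
Cc = 50e-15    # coupling capacitance between the two lines (F), assumed
Vdd = 1.0      # supply voltage (V)
dt = 1e-12     # integration time step (s)

v_agg = v_vic = 0.0
peak_xtalk = 0.0
det = Cg * (Cg + 2 * Cc)                 # determinant of the 2x2 capacitance matrix
for _ in range(5000):                    # simulate 5 ns with forward-Euler integration
    i_agg = (Vdd - v_agg) / R            # aggressor driver switches low -> high at t = 0
    i_vic = (0.0 - v_vic) / R            # victim driver holds its line low
    # Solve C * dv/dt = i for the two coupled nodes (Cg to ground, Cc between the lines).
    v_agg += dt * (i_agg * (Cg + Cc) + i_vic * Cc) / det
    v_vic += dt * (i_vic * (Cg + Cc) + i_agg * Cc) / det
    peak_xtalk = max(peak_xtalk, abs(v_vic))

print(f"peak crosstalk on the quiet victim: {peak_xtalk * 1e3:.1f} mV")
```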
Abstract:
A web service is a software system that provides a machine-processable interface to other machines over the network using different Internet protocols. Web services are increasingly used in industry to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operations that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource be independent of the previous ones, which facilitates scalability. Automated systems, e.g. hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests to be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios under REST constraints requires rigorous approaches that are capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed dependable systems. This thesis presents an integrated design, analysis and validation approach that helps the service developer create dependable and stateful REST web services. The main contribution of this thesis is a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and on the pre- and post-conditions of these methods. The methodology uses the Unified Modeling Language (UML) as the modeling language, which has a wide user base and mature tools that are continuously evolving. We have used the UML class diagram and the UML state machine diagram, with additional design constraints, to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document, and the information presented in them has manifold applications. The service design models also contain information about the time and domain requirements of the service, which can help in requirement traceability, an important part of our approach. Requirement traceability helps in capturing faults in the design models and in other elements of the software development environment by tracing the unfulfilled requirements of the service back and forth. Information about service actors is also included in the design models; this is required for authenticating service requests by authorized actors, since not all types of users have access to all resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces are REST compliant. The second contribution of this thesis is the consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in the web ontology language OWL 2, which can be part of the semantic web. These interfaces are used with OWL 2 reasoners to check for unsatisfiable concepts, which would result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners.
The third contribution of this thesis is the verification and validation of REST web services. We have used model checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic properties such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace back the unfulfilled service goals to detect faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be inserted manually by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach and the latter worked example shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
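A minimal, hypothetical sketch (not the thesis' generated code) of what a behavioral REST interface skeleton with method pre- and post-conditions might look like for a hotel room booking resource. The resource names, states and condition checks are invented here for illustration.

```python
from functools import wraps

def contract(pre, post):
    """Wrap a handler with a precondition and a postcondition check."""
    def decorate(handler):
        @wraps(handler)
        def wrapper(resource, *args, **kwargs):
            assert pre(resource), f"precondition failed for {handler.__name__}"
            result = handler(resource, *args, **kwargs)
            assert post(resource), f"postcondition failed for {handler.__name__}"
            return result
        return wrapper
    return decorate

class RoomBooking:
    """A stateful REST resource: its state machine is CREATED -> CONFIRMED -> CANCELLED."""
    def __init__(self):
        self.state = "CREATED"

    @contract(pre=lambda r: r.state == "CREATED", post=lambda r: r.state == "CONFIRMED")
    def put_confirmation(self):
        # PUT /booking/{id}/confirmation -- only allowed on a freshly created booking.
        self.state = "CONFIRMED"
        return {"status": self.state}

    @contract(pre=lambda r: r.state == "CONFIRMED", post=lambda r: r.state == "CANCELLED")
    def delete(self):
        # DELETE /booking/{id} -- only a confirmed booking may be cancelled in this toy model.
        self.state = "CANCELLED"
        return {"status": self.state}

booking = RoomBooking()
print(booking.put_confirmation())   # ok: precondition holds
print(booking.delete())             # ok: booking is confirmed
```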
Abstract:
This study concentrates on Product Data Management (PDM) systems and on sheet metal design features and classification. In this thesis, PDM is seen as an individual system which handles all product-related data and information. The purpose of relevant data is to carry the manufacturing process forward with fewer errors. Sheet metal features give more information and value to the designed models. The possibility of implementing PDM and sheet metal feature recognition is the core of this study. Their integration should make the design process faster and manufacturing-friendly products easier to design. The triangulation method is the basis for this research. The sections of this triangle are: a scientific literature review, interviews using the Delphi method, and the author's experience and observations. The main key findings of this study are: (1) the area of focus in the triangle (the triangle of three different points of view: business, information exchange and technical) depends on the person's background and their role in the company, (2) the classification in the PDM system (and also in the CAD system) should be done using the materials, tools and machines that are in use in the company, (3) the design process has to become more effective because of the increase in industrial production, sheet metal blank production and the designer's time spent on actual design, and (4) because Design For Manufacture (DFM) integration can be done with CAD programs, DFM integration with the PDM system should also be possible.
Abstract:
A direct-driven permanent magnet synchronous machine for a small urban-use electric vehicle is presented. The measured performance of the machine on the test bench, as well as its performance over the modified New European Drive Cycle, is given. The effect of optimal current components, which maximize efficiency while taking iron loss into account, is compared with simple id = 0 control. The machine currents and losses during the drive cycle are calculated and compared with each other.
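A simplified sketch of the comparison described above: for a required torque, choose the (i_d, i_q) pair that minimizes a crude copper-plus-iron loss model and compare it with the i_d = 0 choice. The machine parameters and the loss model are made up for illustration and are not the paper's machine data.

```python
import numpy as np

p = 4                      # pole pairs (assumed)
psi_m = 0.1                # PM flux linkage, Vs (assumed)
Ld, Lq = 0.3e-3, 0.6e-3    # d/q inductances, H (assumed; Lq > Ld gives reluctance torque for negative i_d)
Rs = 0.02                  # stator resistance, ohm (assumed)
k_fe = 0.05                # crude iron-loss coefficient (assumed)
w_e = 2 * np.pi * 200      # electrical speed, rad/s (assumed operating point)

def torque(i_d, i_q):
    return 1.5 * p * (psi_m * i_q + (Ld - Lq) * i_d * i_q)

def losses(i_d, i_q):
    copper = 1.5 * Rs * (i_d**2 + i_q**2)
    # Very rough iron-loss proxy: proportional to squared stator flux linkage and speed.
    flux2 = (psi_m + Ld * i_d) ** 2 + (Lq * i_q) ** 2
    return copper + k_fe * flux2 * (w_e / 1000.0) ** 2

def best_currents(T_req):
    """Brute-force search over i_d for the minimum-loss pair producing torque T_req."""
    best = None
    for i_d in np.linspace(-100.0, 0.0, 401):
        denom = psi_m + (Ld - Lq) * i_d
        if denom <= 0:
            continue
        i_q = T_req / (1.5 * p * denom)
        cand = (losses(i_d, i_q), i_d, i_q)
        best = min(best, cand) if best else cand
    return best

T_req = 30.0  # required torque, Nm (assumed)
loss_opt, id_opt, iq_opt = best_currents(T_req)
iq_id0 = T_req / (1.5 * p * psi_m)
print(f"i_d = 0     : losses {losses(0.0, iq_id0):6.1f} W")
print(f"optimal i_d : losses {loss_opt:6.1f} W at i_d={id_opt:.1f} A, i_q={iq_opt:.1f} A")
```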
Abstract:
This work is aimed at building an adaptable frame-based system for processing Dravidian languages. There are about 17 languages in this family, spoken by the people of South India. Karaka relations are one of the most important features of Indian languages. They are the semantico-syntactic relations between verbs and other related constituents in a sentence. The karaka relations and surface case endings are analyzed for meaning extraction. This approach is comparable with the broad class of case-based grammars. The efficiency of this approach is put to the test in two applications: one is machine translation and the other is a natural language interface (NLI) for information retrieval from databases. The system mainly consists of a morphological analyzer, a local word grouper, a parser for the source language and a sentence generator for the target language. This work makes several contributions: it gives an elegant and compact account of the relation between vibhakthi (surface case) and karaka roles in Dravidian languages. The same basic mapping also explains simple and complex sentences in these languages, which suggests that the solution is not just ad hoc but has a deeper underlying unity. This methodology could be extended to other free word order languages. Since the frames designed for meaning representation are general, they are adaptable to other languages in this group and to other applications.
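A toy, hypothetical sketch of the vibhakthi-to-karaka mapping idea described above for a single segmented sentence. The suffixes, role assignments and frame structure here are invented for illustration and do not come from the thesis.

```python
CASE_TO_KARAKA = {          # surface case ending -> karaka role (illustrative only)
    "":     "karta",        # nominative-like ending -> agent
    "-e":   "karma",        # accusative-like ending -> object
    "-kku": "sampradana",   # dative-like ending -> recipient
    "-il":  "adhikarana",   # locative-like ending -> location
}

def analyse(tokens):
    """Fill a simple verb frame from (stem, case_ending) pairs followed by a final verb."""
    frame = {"verb": tokens[-1][0], "roles": {}}
    for stem, ending in tokens[:-1]:
        role = CASE_TO_KARAKA.get(ending, "unknown")
        frame["roles"][role] = stem
    return frame

# Hypothetical segmented sentence: "the child gave the book to the teacher".
sentence = [("kutti", ""), ("pustakam", "-e"), ("adhyapakan", "-kku"), ("koduttu", None)]
print(analyse(sentence))
```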