901 results for design or documentation process
Abstract:
A work on prophetic medicine (or the Prophet's medicine) by al-Maqdisī (d.1245) preceded by a short treatise of uncertain authorship on the beautiful names of God.
Abstract:
Supervision of psychotherapists and counselors, especially in the early years of practice, is widely accepted as being important for professional development and for ensuring optimal client outcomes. Although the process of clinical supervision has been extensively studied, less is known about the impact of supervision on psychotherapy practice and client symptom outcome. This study evaluated the impact of clinical supervision on client working alliance and symptom reduction in the brief treatment of major depression. The authors randomly assigned 127 clients with a diagnosis of major depression to 127 supervised or unsupervised therapists to receive eight sessions of problem-solving treatment. Supervised therapists were randomly assigned to either alliance skill-focused or alliance process-focused supervision and received eight supervision sessions. Before beginning treatment, therapists received one supervision session for brief training in the working alliance supervision approach and in the specific characteristics of each case. Standard measures of therapeutic alliance and symptom change were used as dependent variables. The results showed a significant effect for both supervision conditions on working alliance from the first session of therapy, symptom reduction, and treatment retention and evaluation, but no differences in effect between the supervision conditions. It was not possible to separate the effects of supervision from the single pretreatment session, and it is possible that allegiance effects might have inflated the results. The scientific and clinical relevance of these findings is discussed.
Abstract:
Cyclotides are a recently discovered class of proteins that have a characteristic head-to-tail cyclized backbone stabilized by a knotted arrangement of three disulfide bonds. They are exceptionally resistant to chemical, enzymatic and thermal treatments because of their unique structural scaffold. Cyclotides have a range of bioactivities, including uterotonic, anti-HIV, anti-bacterial and cytotoxic activity, but their insecticidal properties suggest that their natural physiological role is in plant defense. They are genetically encoded as linear precursors and subsequently processed to produce mature cyclic peptides, but the mechanism by which this occurs remains unknown. Currently, most cyclotides are obtained via direct extraction from plants in the Rubiaceae and Violaceae families. To facilitate the screening of cyclotides for structure-activity studies and to exploit them in drug design or agricultural applications, a convenient route for the synthesis of cyclotides is vital. In this review, the current chemical, recombinant and biosynthetic routes to the production of cyclotides are discussed.
Abstract:
Review date: Review period January 1992-December 2001. Final analysis July 2004-January 2005. Background and review context: There has been no rigorous systematic review of the outcomes of early exposure to clinical and community settings in medical education. Objectives of review: (1) Identify published empirical evidence of the effects of early experience in medical education, analyse it, and synthesize conclusions from it. (2) Identify the strengths and limitations of the research effort to date, and identify objectives for future research. Search strategy: Ovid search of: BEI, ERIC, Medline, CINAHL and EMBASE. Additional electronic searches of: PsycINFO, Timelit, EBM reviews, SIGLE, and the Cochrane databases. Hand-searches of: Medical Education, Medical Teacher, Academic Medicine, Teaching and Learning in Medicine, Advances in Health Sciences Education, Journal of Educational Psychology. Criteria: Definitions: Experience: Authentic (real as opposed to simulated) human contact in a social or clinical context that enhances learning of health, illness and/or disease, and the role of the health professional. Early: What would traditionally have been regarded as the preclinical phase, usually the first 2 years. Inclusions: All empirical studies (verifiable, observational data) of early experience in the basic education of health professionals, whatever their design or methodology, including papers not in English. Evidence from other health care professions that could be applied to medicine was included. Exclusions: Not empirical; not early; post-basic; simulated rather than 'authentic' experience. Data collection: Careful validation of selection processes. Coding by two reviewers onto an extensively modified version of the standard BEME coding sheet. Accumulation into an Access database. Secondary coding and synthesis of an interpretation. Headline results: A total of 73 studies met the selection criteria and yielded 277 educational outcomes; 116 of those outcomes (from 38 studies) were rated strong and important enough to include in a narrative synthesis of results; 76% of those outcomes were from descriptive studies and 24% from comparative studies. Early experience motivated and satisfied students of the health professions and helped them acclimatize to clinical environments, develop professionally, interact with patients with more confidence and less stress, develop self-reflection and appraisal skills, and develop a professional identity. It strengthened their learning and made it more real and relevant to clinical practice. It helped students learn about the structure and function of the healthcare system, and about preventive care and the role of health professionals. It supported the learning of both biomedical and behavioural/social sciences and helped students acquire communication and basic clinical skills. There were outcomes for beneficiaries other than students, including teachers, patients, populations, organizations and specialties. Early experience increased recruitment to primary care/rural medical practice, though mainly in US studies which introduced it for that specific purpose as part of a complex intervention. Conclusions: Early experience helps medical students socialize to their chosen profession. It helps them acquire a range of subject matter and makes their learning more real and relevant. It has potential benefits for other stakeholders, notably teachers and patients. It can influence career choices.
Abstract:
Count data with excess zeros relative to a Poisson distribution are common in many biomedical applications. A popular approach to the analysis of such data is to use a zero-inflated Poisson (ZIP) regression model. Often, because of the hierarchical study design or the data collection procedure, zero-inflation and lack of independence may occur simultaneously, which renders the standard ZIP model inadequate. To account for the preponderance of zero counts and the inherent correlation of observations, a class of multi-level ZIP regression models with random effects is presented. Model fitting is facilitated using an expectation-maximization algorithm, whereas variance components are estimated via residual maximum likelihood estimating equations. A score test for zero-inflation is also presented. The multi-level ZIP model is then generalized to cope with a more complex correlation structure. Application to the analysis of correlated count data from a longitudinal infant feeding study illustrates the usefulness of the approach.
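For orientation only (the abstract gives no formulas), the standard zero-inflated Poisson probability mass function that such multi-level models extend is sketched below; the symbols, the cluster index i, and the random-effect structure are notation chosen here for illustration, not taken from the paper.

```latex
% Standard ZIP pmf for observation j in cluster i (illustrative notation):
P(Y_{ij} = 0) = \pi_{ij} + (1 - \pi_{ij})\, e^{-\lambda_{ij}}, \qquad
P(Y_{ij} = y) = (1 - \pi_{ij})\, \frac{\lambda_{ij}^{y}\, e^{-\lambda_{ij}}}{y!}, \quad y = 1, 2, \dots
% A multi-level version of the kind described adds cluster-level random effects, e.g.
\log \lambda_{ij} = \mathbf{x}_{ij}^{\top}\boldsymbol{\beta} + u_i, \qquad
\operatorname{logit} \pi_{ij} = \mathbf{z}_{ij}^{\top}\boldsymbol{\gamma} + v_i,
\qquad u_i \sim N(0, \sigma_u^2), \quad v_i \sim N(0, \sigma_v^2).
```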
Abstract:
This article presents a three-dimensional definition space of the group development literature that differentiates group development models on three dimensions: content, population, and path dependency. The multidimensional conceptualization structures and integrates the vast group development literature, enabling direct comparison of competing theories. The utility of this definition space is demonstrated by using the relative positioning of two seemingly competing group development models-the punctuated equilibrium model and the integrative model-to demonstrate their complementarity. The authors also show how organizational researchers and practitioners can use the three-dimensional definition space to select an appropriate theoretical model for the group or group process with which they are working.
Abstract:
Background The production of high yields of recombinant proteins is an enduring bottleneck in the post-genomic sciences that has yet to be addressed in a truly rational manner. Typically eukaryotic protein production experiments have relied on varying expression construct cassettes such as promoters and tags, or culture process parameters such as pH, temperature and aeration to enhance yields. These approaches require repeated rounds of trial-and-error optimization and cannot provide a mechanistic insight into the biology of recombinant protein production. We published an early transcriptome analysis that identified genes implicated in successful membrane protein production experiments in yeast. While there has been a subsequent explosion in such analyses in a range of production organisms, no one has yet exploited the genes identified. The aim of this study was to use the results of our previous comparative transcriptome analysis to engineer improved yeast strains and thereby gain an understanding of the mechanisms involved in high-yielding protein production hosts. Results We show that tuning BMS1 transcript levels in a doxycycline-dependent manner resulted in optimized yields of functional membrane and soluble protein targets. Online flow microcalorimetry demonstrated that there had been a substantial metabolic change to cells cultured under high-yielding conditions, and in particular that high yielding cells were more metabolically efficient. Polysome profiling showed that the key molecular event contributing to this metabolically efficient, high-yielding phenotype is a perturbation of the ratio of 60S to 40S ribosomal subunits from approximately 1:1 to 2:1, and correspondingly of 25S:18S ratios from 2:1 to 3:1. This result is consistent with the role of the gene product of BMS1 in ribosome biogenesis. Conclusion This work demonstrates the power of a rational approach to recombinant protein production by using the results of transcriptome analysis to engineer improved strains, thereby revealing the underlying biological events involved.
Abstract:
The International Cooperation Agency (identified in this article as IDEA) working in Colombia is one of the most important in Colombian society, with programs that support gender rights, human rights, justice and peace, scholarships, aboriginal populations, youth, Afro-descendant populations, economic development in communities, and environmental development. The identified problem arises from the diversified offering of services, collaboration and social intervention, which requires diverse groups of people with multiple agendas, ways of supporting their mandates, disciplines, and professional competences. Knowledge creation and the growth and sustainability of the organization can be endangered by a silo culture and the resulting reduced leverage of the separate groups' capabilities. Organizational memory is generally formed by the tacit knowledge of the organization's members, given the value of the accumulated experience that this kind of social work implies. Its loss is therefore a strategic and operational risk when most problem interventions rely on direct work in the socio-economic field and on lived experiences with communities. The knowledge management solution presented in this article starts, first, with the identification of the people and groups concerned and the creation of a knowledge map as a means of strengthening the ties between organizational members; second, with the introduction of a content management system designed to support the documentation and knowledge sharing processes; and third, with a methodology for adapting a Balanced Scorecard based on the knowledge management processes. These three main steps lead to a knowledge management “solution” that has been implemented in the organization, comprising three components: a knowledge management system, training support and promotion of cultural change.
Abstract:
Liquid-liquid extraction has long been known as a unit operation that plays an important role in industry. This process is well known for its complexity and sensitivity to operating conditions. This thesis presents an attempt to explore the dynamics and control of this process using a systematic approach and state-of-the-art control system design techniques. The process was first studied experimentally under carefully selected operating conditions, which resemble the ranges employed practically under stable and efficient conditions. Data were collected at steady-state conditions using adequate sampling techniques for the dispersed and continuous phases, as well as during the transients of the column, with the aid of a computer-based online data logging system and online concentration analysis. A stagewise single-stage backflow model was improved to mimic the dynamic operation of the column. The developed model accounts for the variation in hydrodynamics, mass transfer, and physical properties throughout the length of the column. End effects were treated by the addition of stages at the column entrances. Two parameters were incorporated in the model, namely a mass transfer weight factor, to correct for the assumption of no mass transfer in the settling zones at each stage, and backmixing coefficients, to handle the axial dispersion phenomena encountered in the course of column operation. The parameters were estimated by minimizing the differences between the experimental and the model-predicted concentration profiles at steady-state conditions using a non-linear optimisation technique. The estimated values were then correlated as functions of the operating parameters and were incorporated in the model equations. The model equations comprise a stiff differential-algebraic system. This system was solved using the GEAR ODE solver. The calculated concentration profiles were compared to those experimentally measured. A very good agreement between the two profiles was achieved, within a relative error of ±2.5%. The developed rigorous dynamic model of the extraction column was used to derive linear time-invariant reduced-order models that relate the input variables (agitator speed, solvent feed flowrate and concentration, feed concentration and flowrate) to the output variables (raffinate concentration and extract concentration) using the asymptotic method of system identification. The reduced-order models were shown to be accurate in capturing the dynamic behaviour of the process, with a maximum modelling prediction error of 1%. The simplicity and accuracy of the derived reduced-order models allow for control system design and analysis of such complicated processes. The extraction column is a typical multivariable process, with agitator speed and solvent feed flowrate considered as manipulated variables, raffinate concentration and extract concentration as controlled variables, and the feed concentration and feed flowrate as disturbance variables. The control system design of the extraction process was tackled as a multi-loop decentralised SISO (Single Input Single Output) system as well as a centralised MIMO (Multi-Input Multi-Output) system, using both conventional and model-based control techniques such as IMC (Internal Model Control) and MPC (Model Predictive Control). Control performance of each control scheme was studied in terms of stability, speed of response, sensitivity to modelling errors (robustness), setpoint tracking capabilities and load rejection.
For decentralised control, multiple loops were assigned to pair each manipulated variable with a controlled variable according to the interaction analysis and other pairing criteria such as the relative gain array (RGA) and singular value decomposition (SVD). The loops rotor speed-raffinate concentration and solvent flowrate-extract concentration showed weak interaction. Multivariable MPC showed more effective performance than the other conventional techniques since it accounts for loop interactions, time delays, and input-output variable constraints.
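As a side note on the pairing analysis described above, the sketch below shows how a relative gain array is computed from a steady-state gain matrix and read for loop pairing; the 2x2 gain values are hypothetical placeholders, not data from the thesis.

```python
import numpy as np

def relative_gain_array(G):
    """Bristol's RGA: element-wise product of G with the transpose of its inverse."""
    return G * np.linalg.inv(G).T

# Hypothetical steady-state gain matrix for illustration only
# (rows: raffinate and extract concentrations; columns: rotor speed and solvent flowrate).
G = np.array([[-0.8, 0.2],
              [ 0.3, 1.1]])

print(relative_gain_array(G))
# Diagonal entries near 1 (about 0.94 here) favour pairing rotor speed with
# raffinate concentration and solvent flowrate with extract concentration,
# i.e. loops with weak interaction.
```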
Abstract:
The invention provides methods and apparatus for thermal treatment, e.g. for pyrolysis of lignin. The lignin is provided to a reaction chamber as a paste, which can reduce or avoid process difficulties encountered when heating lignin.
Abstract:
Full text: This Proceedings volume contains selected papers from the Fourth International CIRP-sponsored Conference on Digital Enterprise Technology (DET2007), which was held at the University of Bath, UK, 19–21 September 2007. All selected papers have been suitably enhanced for publication in the Journal and have undergone full review. Digital enterprise technology (DET) is ‘the collection of systems and methods for the digital modelling and analysis of the global product development and realization process, in the context of lifecycle management.’ The principal aim of the DET concept is to provide a coherent context for the development and integration of the various digital technologies that underpin modern design and manufacturing. These technologies can be classified according to the following five key areas: (1) distributed and collaborative design; (2) process modelling and process planning; (3) advanced factory design and modelling; (4) physical-to-digital environment integrators and verification; (5) enterprise integration technologies. This special issue is representative of the wide breadth of the DET concept, including a comprehensive review of digital engineering, design processes, digital modelling of machine tools, forming, robotics and machining processes, verification and metrology, and dynamic networks. It is particularly pleasing to see the development of metrology as a key aspect of modern manufacturing technology, linking design intent to process capability. The papers published herein will facilitate the exploration of new and evolving research concepts by the international research community and will influence the development of international standards for the application of DET technologies.
Abstract:
The introduction of phase change material fluid and nanofluid in micro-channel heat sink design can significantly increase the cooling capacity of the heat sink because of the unique features of these two kinds of fluids. To better assist the design of a high-performance micro-channel heat sink using phase change fluid and nanofluid, the heat transfer enhancement mechanism behind the flow of such fluids must be completely understood. A detailed parametric study is conducted to further investigate the heat transfer enhancement of the phase change material particle suspension flow, using the two-phase non-thermal-equilibrium model developed by Hao and Tao (2004). The parametric study is conducted under normal conditions, with Reynolds numbers of Re = 90–600 and phase change material particle concentrations of ϵp ≤ 0.25, as well as under extreme conditions of very low Reynolds numbers (Re < 50) and high phase change material particle concentration (ϵp = 50%–70%) slurry flow. By using two newly defined parameters, named the effectiveness factor ϵeff and the performance index PI, respectively, it is found that there exists an optimal relation between the channel design parameters L and D, the particle volume fraction ϵp, the Reynolds number Re, and the wall heat flux qw. The influence of the particle volume fraction ϵp, particle size dp, and particle viscosity μp on the phase change material suspension flow is investigated and discussed. The model was validated against available experimental data. The conclusions will assist designers in making decisions related to the design or selection of a micro-pump suitable for micro- or mini-scale heat transfer devices. To understand the heat transfer enhancement mechanism of the nanofluid flow at the particle level, the lattice Boltzmann method is used because of its mesoscopic features and its many numerical advantages. By using a two-component lattice Boltzmann model, the heat transfer enhancement of the nanofluid is analyzed, by incorporating the different forces acting on the nanoparticles into the model. It is found that the nanofluid provides better heat transfer enhancement at low Reynolds numbers, and that the Brownian motion effect of the nanoparticles is weakened as the flow speed increases.
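For reference only (the abstract states the Reynolds-number ranges but not the definition), the Reynolds number that conventionally parameterizes such channel flows is assumed here to be the standard one below.

```latex
% Standard channel-flow Reynolds number (assumed definition, not quoted from the thesis):
Re = \frac{\rho\, u_m\, D}{\mu}
% \rho: fluid density, u_m: mean flow velocity, D: channel (hydraulic) diameter, \mu: dynamic viscosity.
```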
Abstract:
Memory (cache, DRAM, and disk) is in charge of providing data and instructions to a computer's processor. In order to maximize performance, the speeds of the memory and the processor should be equal. However, using memory that always matches the speed of the processor is prohibitively expensive. Computer hardware designers have managed to drastically lower the cost of the system with the use of memory caches by sacrificing some performance. A cache is a small piece of fast memory that stores popular data so it can be accessed faster. Modern computers have evolved into a hierarchy of caches, where a memory level is the cache for a larger and slower memory level immediately below it. Thus, by using caches, manufacturers are able to store terabytes of data at the cost of the cheapest memory while achieving speeds close to that of the fastest one. The most important decision about managing a cache is what data to store in it. Failing to make good decisions can lead to performance overheads and over-provisioning. Surprisingly, caches choose data to store based on policies that have not changed in principle for decades. However, computing paradigms have changed radically, leading to two noticeably different trends. First, caches are now consolidated across hundreds to even thousands of processes. Second, caching is being employed at new levels of the storage hierarchy due to the availability of high-performance flash-based persistent media. This brings four problems. First, as the number of workloads sharing a cache increases, it is more likely that they contain duplicated data. Second, consolidation creates contention for caches, and if not managed carefully, it translates into wasted space and sub-optimal performance. Third, as contended caches are shared by more workloads, administrators need to carefully estimate specific per-workload requirements across the entire memory hierarchy in order to meet per-workload performance goals. Finally, current cache write policies are unable to simultaneously provide performance and consistency guarantees for the new levels of the storage hierarchy. We addressed these problems by modeling their impact and by proposing solutions for each of them. First, we measured and modeled the amount of duplication at the buffer cache level and the contention in real production systems. Second, we created a unified model of workload cache usage under contention, to be used by administrators for provisioning or by process schedulers to decide which processes to run together. Third, we proposed methods for removing cache duplication and for eliminating the space wasted because of contention. Finally, we proposed a technique to improve the consistency guarantees of write-back caches while preserving their performance benefits.
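As context for the "policies that have not changed in principle for decades", the sketch below shows one such classic replacement policy, least-recently-used (LRU); it is purely illustrative and is not the mechanism proposed in this work.

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: on overflow, evict the entry unused for the longest time."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key not in self.items:
            return None                     # miss: caller fetches from the slower level below
        self.items.move_to_end(key)         # mark as most recently used
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict the least recently used entry

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")               # "a" becomes most recently used
cache.put("c", 3)            # evicts "b"
print(list(cache.items))     # ['a', 'c']
```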
Abstract:
The goal of the research is to provide an overview of the factors that play a major role in structural failures and to focus on the importance that bracing has in construction accidents. A temporary bracing system is important to construction safety, yet it is often neglected. Structural collapses often occur due to insufficient support of the loads that are applied at the time of failure. The structural load is usually analyzed by conceiving of the whole structure as a completed entity, and there is frequently a lack of design or proper implementation of systems that can provide stability during construction. Often, the specific provisions and requirements of temporary bracing systems are left to the workers on the job site, who may not have the qualifications or expertise for proper execution. To assess whether bracing design should receive more attention in codes and standards, failures that could have been avoided by the presence and/or correct design of a bracing system were searched for and selected from a variety of cases in the engineering literature. Eleven major cases were found, spanning a time frame of almost 70 years and clearly showing that the topic should receive more attention. The case studies are presented in chronological order and in a systematic way. Each failed structure is described in terms of its design components, and the sequence of failure is reconstructed. Then, the causes and failure mechanism are presented. Advice on how to avoid similar failures from happening again and hypothetical solutions that could have prevented the collapses are identified. The findings show that insufficient or nonexistent bracing mainly results from human negligence or miscalculation in the load analysis, and that the time has come to fully acknowledge that temporary structures should be better accounted for in design and not left to contractors' means and methods of construction.
Abstract:
Speckle is being used as a characterization tool for the analysis of the dynamics of slowly varying phenomena occurring in biological and industrial samples. The retrieved data take the form of a sequence of speckle images. The analysis of these images should reveal the inner dynamics of the biological or physical process taking place in the sample. Very recently, it has been shown that principal component analysis is able to split the original data set into a collection of classes. These classes can be related to the dynamics of the observed phenomena. At the same time, statistical descriptors of biospeckle images have been used to retrieve information on the characteristics of the sample. These statistical descriptors can be calculated in almost real time and provide fast monitoring of the sample. On the other hand, principal component analysis requires a longer computation time, but the results contain more information, related to spatio-temporal patterns that can be identified with physical processes. This contribution merges both descriptions and uses principal component analysis as a pre-processing tool to obtain a collection of filtered images from which a simpler statistical descriptor can be calculated. The method has been applied to slowly varying biological and industrial processes.
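A minimal sketch of the kind of PCA pre-filtering described, assuming the speckle sequence is available as a NumPy array of frames; the function name, the number of retained components, and the simple activity descriptor at the end are illustrative choices, not the authors' implementation.

```python
import numpy as np

def pca_filter_stack(stack, keep):
    """Rebuild a speckle image sequence from its leading principal components only.

    stack: array of shape (n_frames, height, width); keep: number of components retained.
    """
    n, h, w = stack.shape
    X = stack.reshape(n, h * w).astype(float)
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)   # PCA via SVD of centred data
    Xk = (U[:, :keep] * s[:keep]) @ Vt[:keep] + mean           # keep leading components only
    return Xk.reshape(n, h, w)

# Synthetic stand-in for an acquired biospeckle sequence (illustration only)
rng = np.random.default_rng(0)
frames = rng.random((64, 32, 32))
filtered = pca_filter_stack(frames, keep=5)

# A simple statistical descriptor computed on the filtered sequence:
# mean absolute frame-to-frame difference as a crude activity measure.
activity = np.abs(np.diff(filtered, axis=0)).mean()
print(activity)
```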