995 results for Galton-Watson branching process
Abstract:
A direct method of preparing cast aluminium alloy-graphite particle composites using uncoated graphite particles is reported. The method consists of introducing and dispersing uncoated but suitably pretreated graphite particles in aluminium alloy melts, and casting the resulting composite melts in suitable permanent moulds. The pretreatment required for the dispersion of the uncoated graphite particles in aluminium alloy melts consists of heating the graphite particles to 400° C in air for 1 h just prior to their dispersion in the melts. The effects of alloying elements such as Si, Cu and Mg on the dispersibility of pretreated graphite in molten aluminium are also reported. It was found that additions of about 0.5% Mg or 5% Si significantly improve the dispersibility of graphite particles in aluminium alloy melts, as indicated by the high recoveries of graphite in the castings of these composites. Using the pre-heat-treated graphite particles, it was also possible to disperse up to 3% graphite in LM 13 alloy melts and to retain the graphite particles in a well-distributed fashion in the castings. The observations in this study have been related to the information presently available on wetting between graphite and molten aluminium in the presence of different elements, and to our own thermogravimetric analysis studies on graphite particles. The physical and mechanical properties of the LM 13-3% graphite composite made using pre-heat-treated graphite powder were found to be adequate for many applications, including pistons, which have been successfully used in internal combustion engines.
Abstract:
Considers the magnetic response of a charged Brownian particle undergoing a stochastic birth-death process, which simulates electron-hole pair production and recombination in semiconductors. The authors obtain a non-zero orbital diamagnetism that can be large without violating the Van Leeuwen theorem (1921).
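The abstract only names the ingredients of the model; as a minimal, hypothetical sketch of the kind of stochastic birth-death process invoked here (the per-capita production and recombination rates below are invented for illustration and do not reflect the authors' actual model), a Gillespie-style simulation might look like:

```python
import random

def gillespie_birth_death(n0, birth_rate, death_rate, t_max):
    """Simulate a continuous-time birth-death process (Gillespie algorithm).

    Models a population (e.g. charge carriers) in which each individual
    is duplicated at rate `birth_rate` and removed at rate `death_rate`.
    Returns the lists of event times and population sizes.
    """
    t, n = 0.0, n0
    times, sizes = [t], [n]
    while t < t_max and n > 0:
        t += random.expovariate(n * (birth_rate + death_rate))
        if random.random() < birth_rate / (birth_rate + death_rate):
            n += 1  # pair production
        else:
            n -= 1  # recombination
        times.append(t)
        sizes.append(n)
    return times, sizes

times, sizes = gillespie_birth_death(n0=100, birth_rate=1.0, death_rate=1.1, t_max=50.0)
print(f"population after {times[-1]:.1f} time units: {sizes[-1]}")
```

With the death rate slightly above the birth rate the process is subcritical, the regime in which a Galton-Watson-type branching process dies out almost surely.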
Abstract:
The aim of this study was to examine the actions of geographically dispersed process stakeholders (doctors, community pharmacists and residential aged care facilities (RACFs)) in coping with the information silos that exist within and across different settings. The study was set in three metropolitan RACFs in Sydney, Australia and employed a qualitative approach using semi-structured interviews, non-participant observations and artefact analysis. Findings showed that medication information was stored in silos, which required specific actions by each setting to translate this information to fit local requirements. A salient example of this was the way in which community pharmacists used the RACF medication charts to prepare residents' pharmaceutical records. This translation of medication information across settings was often accompanied by telephone or face-to-face conversations to cross-check, validate or obtain new information. Findings highlighted that technological interventions that work in silos can negatively impact the quality of medication management processes in RACF settings. The implementation of commercial software applications like electronic medication charts needs to be appropriately integrated to satisfy the collaborative information requirements of the RACF medication process.
Abstract:
This thesis studies the human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurement of the expression of tens of thousands of genes simultaneously. In a single study, this data is traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, such data has been largely unavailable, and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a large new ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text-mining and decision-tree based method for automatic conversion of human-readable free-text microarray data annotations into a categorised format. Comparability of the data, and minimisation of the systematic measurement errors characteristic of each laboratory in this large cross-laboratory integrated dataset, were ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and help from another purpose-built sample ontology. A preface and motivation for the construction and analysis of a global map of human gene expression is given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these sets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression on a global level.
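The abstract names principal component analysis and hierarchical clustering as the exploratory tools. A generic sketch of that pipeline, with random data standing in for the integrated GEO/ArrayExpress expression matrix (the shapes, component counts and cluster number below are invented, not taken from the thesis), might be:

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical expression matrix: rows = samples, columns = genes.
rng = np.random.default_rng(0)
expression = rng.normal(size=(60, 5000))

# Project the samples onto the leading principal components.
pca = PCA(n_components=10)
scores = pca.fit_transform(expression)
print("variance explained by first 3 PCs:", pca.explained_variance_ratio_[:3])

# Hierarchically cluster the samples in the reduced space.
tree = linkage(scores, method="average", metric="euclidean")
labels = fcluster(tree, t=4, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```

In practice the resulting clusters would be interpreted against sample annotations, such as the purpose-built ontology the abstract mentions.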
Abstract:
In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces. Speed depends on the hardware and the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it is necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost. It becomes necessary to find trade-offs between accuracy and effort. We study efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per-input basis. A framework is proposed for understanding and modeling efficient classification of images, in which classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability; earlier frameworks are lacking in this regard. The overall contribution is two-fold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with, which allows the separation of the essential from the conventional. To determine whether the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented. For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs, and we ask whether accuracy-versus-effort trade-offs can be controlled after training. Regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner and then ask whether problem-specific organization is necessary.
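The abstract describes classification as a tree-like process in which unnecessary computation is avoided per input via delegation and confidence. As a toy illustration of such a rule (the Stage type, the costs and the threshold are all hypothetical; this is not the thesis framework itself), a two-stage cascade with confidence-based early exit might be sketched as:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Stage:
    predict: Callable[[object], Tuple[int, float]]  # returns (label, confidence)
    cost: float  # computational effort of running this stage

def cascade_classify(stages: List[Stage], x, threshold: float) -> Tuple[int, float]:
    """Run stages cheapest-first; stop once the confidence clears the threshold.

    `threshold` is the accuracy-versus-effort knob: a low value exits early
    (fast, less accurate); a high value delegates to costlier stages.
    """
    spent, label = 0.0, -1
    for stage in stages:
        spent += stage.cost
        label, confidence = stage.predict(x)
        if confidence >= threshold:
            break  # confident enough; skip the remaining, costlier stages
    return label, spent

# Dummy stages: a cheap, unsure classifier and an expensive, confident one.
cheap = Stage(predict=lambda x: (0, 0.6), cost=1.0)
costly = Stage(predict=lambda x: (1, 0.99), cost=10.0)
print(cascade_classify([cheap, costly], x=None, threshold=0.9))  # -> (1, 11.0)
```

Raising the threshold buys accuracy at the price of effort, which is one way a trade-off could be controlled after training, as the abstract asks.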
Abstract:
An important question which has to be answered in evaluating the suitability of a microcomputer for a control application is the time it would take to execute the specified control algorithm. In this paper, we present a method of obtaining closed-form formulas to estimate this time. These formulas are applicable to control algorithms in which arithmetic operations and matrix manipulations dominate. The method does not require writing detailed programs for implementing the control algorithm. Using this method, the execution times of a variety of control algorithms on a range of 16-bit mini- and recently announced microcomputers are calculated. The formulas have been verified independently by an analysis program, which computes the execution time bounds of control algorithms coded in Pascal when they are run on a specified micro- or minicomputer.
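The formulas themselves are not reproduced in the abstract; the sketch below merely illustrates the kind of closed-form operation-count estimate described, with made-up per-operation timings standing in for real instruction timing data:

```python
# Hypothetical per-operation timings (seconds) for a 16-bit machine; real
# values would come from the processor's instruction timing documentation.
T_ADD, T_MUL = 4e-6, 12e-6

def matmul_time(n: int, m: int, p: int) -> float:
    """Estimate for multiplying an n x m matrix by an m x p matrix:
    n*p*m multiplications and n*p*(m-1) additions dominate."""
    return n * p * m * T_MUL + n * p * (m - 1) * T_ADD

def state_update_time(n_states: int, n_inputs: int) -> float:
    """Estimate for one control step x' = A*x + B*u."""
    return (matmul_time(n_states, n_states, 1)    # A*x
            + matmul_time(n_states, n_inputs, 1)  # B*u
            + n_states * T_ADD)                   # final vector addition

print(f"x' = Ax + Bu, 4 states, 2 inputs: {state_update_time(4, 2) * 1e6:.0f} us")
```

Estimates of this kind can be evaluated before any code is written, which is the point of the closed-form approach.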
Abstract:
Photography is now a highly automated activity in which people enjoy phototaking by pointing and pressing a button. While this liberates people from having to interact with the processes of photography, e.g., controlling the parameters of the camera or printing images in the darkroom, we argue that an engagement with such processes can in fact enrich people's experience of phototaking. Drawing from fieldwork with members of a film-based photography club, we found that people who engaged deeply with the various processes of phototaking experienced photography richly and meaningfully. Being able to participate fully in the entire process gave them a sense of achievement over the final result. Having the opportunity to engage with the process also allowed them to learn and hone their photographic skills. Through this understanding, we can imagine future technologies that enrich experiences of photography by providing the means to interact with photographic processes in new ways.
Abstract:
The change in energy during hydrogen abstraction by ketones is estimated for different electronic states as a function of the intermolecular orbital overlap, employing perturbation theory. The results suggest that ketones preferentially undergo the in-plane reaction and abstract a hydrogen atom in their triplet nπ* state. For ketones whose triplet ππ* state lies below the triplet nπ* state, hydrogen abstraction can take place in the ππ* state owing to the crossing of the zero-order reaction surfaces of the nπ* and ππ* states.
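The abstract leaves the perturbation expression implicit; the standard second-order form on which such overlap arguments typically rest (a textbook expression, not necessarily the authors' exact treatment) is:

```latex
\Delta E \;\approx\; \sum_{b \neq a}
  \frac{\left| \langle \psi_a | \hat{H}' | \psi_b \rangle \right|^{2}}{E_a - E_b},
\qquad
\langle \psi_a | \hat{H}' | \psi_b \rangle \;\propto\; S_{ab},
```

so the stabilization grows roughly as the square of the intermolecular overlap S_ab, consistent with the preference for the in-plane approach reported above.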
Abstract:
The phenomenon of branching at specific angles in streamer breakdown studies is found to be more universal than previously thought. The angles measured in the breakdown of gases show that the coefficient of field distortion, K, lies in the range from 1 to less than 0.1. The values of K so obtained agree well with those envisaged in the criterion of the streamer mechanism. It is hoped that the branching angles observed in various types of breakdown may be explained similarly.
Abstract:
The respiratory chain is found in the inner mitochondrial membrane of higher organisms and in the plasma membrane of many bacteria. It consists of several membrane-spanning enzymes, which conserve the energy liberated from the degradation of food molecules as an electrochemical proton gradient across the membrane. The proton gradient can later be utilized by the cell for different energy-requiring processes, e.g. ATP production, cellular motion or active transport of ions. The difference in proton concentration between the two sides of the membrane is a result of the translocation of protons by the enzymes of the respiratory chain, from the negatively charged side (N-side) to the positively charged side (P-side) of the lipid bilayer, against the proton concentration gradient. The endergonic proton transfer is driven by the flow of electrons through the enzymes of the respiratory chain, from low redox-potential electron donors to acceptors of higher potential, and ultimately to oxygen. Cytochrome c oxidase is the last enzyme in the respiratory chain and catalyzes the reduction of dioxygen to water. The redox reaction is coupled to proton transport across the membrane by an as yet unresolved mechanism. Cytochrome c oxidase has two proton-conducting pathways through which protons are taken up into the interior of the enzyme from the N-side of the membrane. The K-pathway transfers only substrate protons, which are consumed in the process of water formation at the catalytic site. The D-pathway transfers both substrate protons and protons that are pumped to the P-side of the membrane. This thesis focuses on the role of two conserved amino acids in proton translocation by cytochrome c oxidase, glutamate 278 and tryptophan 164. Glu278 is located at the end of the D-pathway and is thought to constitute the branching point for substrate and pumped protons. In this work, it was shown that although Glu278 has an important role in the proton transfer mechanism, its presence is not an obligatory requirement. Alternative structural solutions in the area around Glu278, much like the ones present in some distantly related heme-copper oxidases, could in the absence of Glu278 support the formation of a long hydrogen-bonded water chain through which proton transfer from the D-pathway to the catalytic site is possible. The other studied amino acid, Trp164, is hydrogen-bonded to the ∆-propionate of heme a3 of the catalytic site. Mutation of this amino acid showed that it may be involved in regulating proton access to a proton acceptor, a pump site, from which the proton is later expelled to the P-side of the membrane. The ion pair formed by the ∆-propionate of heme a3 and arginine 473 is likely to form a gate-like structure, which regulates proton mobility to the P-side of the membrane. The same gate may also be part of an exit path through which water molecules produced at the catalytically active site are removed towards the external side of the membrane. Time-resolved optical and electrometric experiments with the Trp164 to phenylalanine mutant revealed a so far undetected step in the proton pumping mechanism: during the A to PR transition of the catalytic cycle, a proton is transferred from Glu278 to the pump site, located somewhere in the vicinity of the ∆-propionate of heme a3. A mechanism for proton pumping by cytochrome c oxidase is proposed on the basis of the presented results, and the mechanism is discussed in relation to relevant experimental data. A common proton pumping mechanism for all members of the heme-copper oxidase family is moreover considered.
Abstract:
Neuronal plasticity is a well-characterized phenomenon in the developing and adult brain. It refers to the capacity of a single neuron to modify its morphology, synaptic connections and activity. Neuronal connections and the capacity for plastic events are compromised in several pathological disorders, such as major depression, and neuronal atrophy has been reported in depressive patients. Neurotrophins are a group of secretory proteins functionally classified as neuronal survival factors. Neurotrophins, especially brain-derived neurotrophic factor (BDNF), have also been associated with promoting neuronal plasticity in dysfunctional neuronal networks. Chronic antidepressant treatment increases plastic events, including neurogenesis and the arborization and branching of neurites, in distinct brain areas such as the hippocampus. One suggested mode of action is that antidepressants elevate the synaptic levels of BDNF, thus further activating several signaling cascades via the trkB receptor. In our studies we have tried to clarify the mechanisms of action of antidepressants and to resolve the role of BDNF in this process. We found that chronic antidepressant treatment increases markers of neuronal plasticity in both the hippocampus and the medial prefrontal cortex, both of which are closely linked to the etiology of major depression. Secondary actions of antidepressants include rapid activation of the trkB receptor followed by phosphorylation of the transcription factor CREB. In addition, activation of CREB by phosphorylation appears responsible for regulating the expression of the BDNF gene. Using transgenic mice, we found that BDNF-induced, trkB-mediated signaling proved crucial for the behavioral effects of antidepressants in the forced swimming test and for the survival of newly born neurons in the adult hippocampus. Antidepressants not only increased neurogenesis in the adult hippocampus but also elevated the turnover of hippocampal neurons. During these studies we also discovered that another trkB ligand, NT-4, is involved in morphine-mediated anti-nociception and tolerance. These results present a novel role for trkB-mediated signaling in the plastic events of the opioid system. This thesis evaluates neuronal plasticity and trkB as a target for future antidepressant treatments.
Abstract:
My System of Career Influences (MSCI) is a qualitative guided reflection process for adolescents and adults that is based on the Systems Theory Framework (STF; McMahon & Patton, 1995; Patton & McMahon, 1999, 2006, 2014) of career development. Reflecting the trend towards more holistic theories and models of career counselling, the MSCI enables users to identify, prioritise and story their career influences, thus enabling them to contextualise career decisions and career transitions.
Abstract:
In the field of workplace air quality, measuring and analyzing the size distribution of airborne particles to identify their sources and apportion their contribution has become widely accepted; however, the driving factors that influence this parameter, particularly for nanoparticles (< 100 nm), have not been thoroughly determined. Identification of the driving factors and, in turn, of general trends in the size distribution of emitted particles would facilitate prediction of nanoparticle emission behavior and significantly contribute to exposure assessment. In this study, a comprehensive analysis of particle number size distribution data, with a particular focus on the ultrafine size range, for synthetic clay particles emitted from a jet milling machine was conducted using the multi-lognormal fitting method. The results showed a relatively high contribution of nanoparticles to the emissions in many of the tested cases, and also that both the surface treatment and the feed rate of the machine are significant factors influencing the size distribution of the emitted particles in this size range. In particular, applying surface treatments and increasing the machine feed rate have the similar effect of reducing particle size; however, no general trend was found in the variation of the size distribution across different surface treatments and feed rates. The findings of our study demonstrate that for this process, and for other activities where no general trend is found in the size distribution of the emitted airborne particles because the driving factors act dissimilarly, each case must be treated separately in terms of workplace exposure assessment and regulation.
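Multi-lognormal fitting is a standard technique for decomposing a measured size distribution into modes; a self-contained sketch with synthetic data in place of the study's measurements (all mode parameters below are invented) might look like:

```python
import numpy as np
from scipy.optimize import curve_fit

def lognormal_mode(d, n, cmd, gsd):
    """dN/dlnD of one lognormal mode at diameter d: total count n,
    count median diameter cmd, geometric standard deviation gsd."""
    return (n / (np.sqrt(2 * np.pi) * np.log(gsd))
            * np.exp(-np.log(d / cmd) ** 2 / (2 * np.log(gsd) ** 2)))

def bimodal(d, n1, cmd1, gsd1, n2, cmd2, gsd2):
    return lognormal_mode(d, n1, cmd1, gsd1) + lognormal_mode(d, n2, cmd2, gsd2)

# Synthetic data standing in for measured number size distributions.
d = np.logspace(1, 3, 60)  # 10 nm to 1000 nm
true = bimodal(d, 8e3, 45.0, 1.6, 2e3, 250.0, 1.8)
noisy = true * np.random.default_rng(1).normal(1.0, 0.05, d.size)

popt, _ = curve_fit(bimodal, d, noisy, p0=[1e4, 50, 1.5, 1e3, 200, 1.5])
print("ultrafine mode (N, CMD, GSD):", popt[:3])
print("coarser mode (N, CMD, GSD):", popt[3:])
```

The fitted count median diameters are what would shift if, say, a surface treatment or a higher feed rate reduced the emitted particle size.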
Abstract:
Background: Poor clinical handover has been associated with inaccurate clinical assessment and diagnosis, delays in diagnosis and test ordering, medication errors and decreased patient satisfaction in the acute care setting. Research on the handover process in the residential aged care sector is very limited. Purpose: The aims of this study were to: (i) develop an in-depth understanding of the handover process in aged care by mapping all the key activities and their information dynamics; (ii) identify gaps in information exchange in the handover process and analyze their implications for resident safety; and (iii) develop practical recommendations on how information and communication technology (ICT) can improve the process and resident safety. Methods: The study was undertaken at a large metropolitan facility in NSW with more than 300 residents and a staff including 55 registered nurses (RNs) and 146 assistants in nursing (AINs). A total of 3 focus groups, 12 interviews and 3 observation sessions were conducted between July and October 2010. Process mapping was undertaken by translating the qualitative data via a five-category code book that was developed prior to the analysis. Results: Three major sub-processes were identified and mapped: handover process (HOP) I, "Information gathering by RN"; HOP II, "Preparation of preliminary handover sheet"; and HOP III, "Execution of handover meeting". Inefficiencies identified in the handover included duplication of information, use of multiple communication modes and information sources, and lack of standardization. Conclusion: By providing a robust process model of handover, this study makes two critical contributions to research in aged care: (i) a means to identify important, possibly suboptimal practices; and (ii) valuable evidence to plan and improve ICT implementation in residential aged care. The mapping of this process enabled analysis of gaps in information flow and of potential impacts on resident safety. In addition, it offers the basis for further studies into a process that, despite its importance for securing resident safety and continuity of care, lacks research.