914 results for COHERENT OTDR
Abstract:
Background Accumulated biological research outcomes show that biological functions do not depend on individual genes, but on complex gene networks. Microarray data are widely used to cluster genes according to their expression levels across experimental conditions. However, functionally related genes generally do not show coherent expression across all conditions since any given cellular process is active only under a subset of conditions. Biclustering finds gene clusters that have similar expression levels across a subset of conditions. This paper proposes a seed-based algorithm that identifies coherent genes in an exhaustive but efficient manner. Methods In order to find the biclusters in a gene expression dataset, we exhaustively select combinations of genes and conditions as seeds to create candidate bicluster tables. The tables have two columns: (a) a gene set, and (b) the conditions on which the gene set has dissimilar expression levels to the seed. First, the genes with fewer than the maximum number of dissimilar conditions are identified and a table of these genes is created. Second, the rows that have the same dissimilar conditions are grouped together. Third, the table is sorted in ascending order by the number of dissimilar conditions. Finally, beginning with the first row of the table, a test is run repeatedly to determine whether the cardinality of the gene set in the row is greater than the minimum threshold number of genes in a bicluster. If so, a bicluster is output and the corresponding row is removed from the table. Repeating this process, all biclusters in the table are systematically identified until the table becomes empty. Conclusions This paper presents a novel biclustering algorithm for the identification of additive biclusters. Since it exhaustively tests combinations of genes and conditions, the additive biclusters can be found more readily.
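The candidate-table procedure described in the Methods section can be sketched roughly as follows. This is a minimal single-seed sketch; the dissimilarity criterion (deviation of a gene's additive offset from its median offset), the thresholds, and the data layout are our assumptions, not details from the paper:

```python
from collections import defaultdict

def find_biclusters(expr, seed_gene, delta=0.5, max_dissim=1, min_genes=2):
    """Single-seed sketch of the candidate-table procedure.

    expr maps gene -> list of expression values (one per condition).
    A condition is 'dissimilar' if the gene's additive offset from the
    seed on that condition deviates from its median offset by more than
    delta (an assumed additive-coherence criterion).
    """
    seed = expr[seed_gene]
    n_cond = len(seed)
    table = defaultdict(set)            # dissimilar-condition set -> genes
    for gene, vals in expr.items():
        offsets = [vals[c] - seed[c] for c in range(n_cond)]
        median = sorted(offsets)[n_cond // 2]
        dissim = frozenset(c for c in range(n_cond)
                           if abs(offsets[c] - median) > delta)
        if len(dissim) <= max_dissim:   # step 1: keep genes under threshold
            table[dissim].add(gene)     # step 2: group identical rows
    biclusters = []
    for dissim in sorted(table, key=len):        # step 3: ascending sort
        genes = table[dissim]
        if len(genes) >= min_genes:              # step 4: cardinality test
            biclusters.append((sorted(genes),
                               sorted(set(range(n_cond)) - dissim)))
    return biclusters

expr = {'g1': [1, 2, 3, 4], 'g2': [2, 3, 4, 5], 'g3': [1, 2, 9, 4]}
print(find_biclusters(expr, 'g1'))   # g1 and g2 are additively coherent
```

In this toy data, g2 tracks the seed g1 with a constant offset of 1 across all conditions, while g3 deviates on one condition, so only {g1, g2} forms a bicluster over all four conditions.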
Abstract:
This article proposes offence-specific guidelines for how prosecutorial discretion should be exercised in cases of voluntary euthanasia and assisted suicide. Similar guidelines have been produced in England and Wales but we consider them to be deficient in a number of respects, including that they lack a set of coherent guiding principles. In light of these concerns, we outline an approach to constructing alternative guidelines that begins with identifying three guiding principles that we argue are appropriate for this purpose: respect for autonomy, the need for high quality prosecutorial decision-making and the importance of public confidence in that decision-making.
Abstract:
Advances in algorithms for approximate sampling from a multivariable target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in the last few years, which avoids direct likelihood computation by repeatedly sampling data from the model and comparing observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments, that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments, and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way.
If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information to make better decisions about future design points. This is of particular interest if the data can be collected sequentially. In a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design that accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate the muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively ‘dies’. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a small number of years after the initial onset of disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists. Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than relying on indirect techniques such as muscle strength assessment, which is generally unable to detect progression due to the body’s natural attempts at compensation.
Part III of this thesis builds upon a previous Bayesian technique, which develops a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainties. More specifically, we develop a more reliable MUNE method by applying marginalisation over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
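The likelihood-free idea described in Part I (draw parameters from the prior, simulate data, and keep draws whose summary statistic falls close to the observed one) can be sketched as a basic ABC rejection sampler. This is a toy normal-mean example with our own choice of tolerance and summary statistic, not the thesis's SMC algorithms:

```python
import random
import statistics

def abc_rejection(observed, simulate, prior_sample, summary,
                  n_draws=2000, eps=0.5):
    """ABC rejection: keep prior draws whose simulated summary statistic
    lies within eps of the observed summary."""
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()                # draw a parameter from the prior
        s_sim = summary(simulate(theta))      # simulate data, summarise it
        if abs(s_sim - s_obs) < eps:          # compare summary statistics
            accepted.append(theta)
    return accepted

# Toy example: infer a normal mean through its sample-mean summary.
random.seed(0)
observed = [random.gauss(3.0, 1.0) for _ in range(50)]
posterior = abc_rejection(
    observed,
    simulate=lambda th: [random.gauss(th, 1.0) for _ in range(50)],
    prior_sample=lambda: random.uniform(-10.0, 10.0),
    summary=statistics.mean,
)
print(len(posterior), statistics.mean(posterior))  # draws concentrate near 3
```

The acceptance rate here is the efficiency cost the thesis targets: most of the 2000 model simulations are wasted, which is exactly what the SMC-based ABC algorithms of Part I are designed to reduce.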
Abstract:
Vesicular and groundmass phyllosilicates in a hydrothermally altered basalt from the Point Sal ophiolite, California, have been studied using transmission electron microscopy (TEM). Pore-filling phyllosilicates are texturally characterized as coherent, relatively thick and defect-free crystals of chlorite (14 Å) with occasional 24-Å periodicities. Groundmass phyllosilicates are texturally characterized as 1) randomly oriented crystals up to 200 Å in width and 2) larger, more coherent crystals up to 1000 Å in width. Small crystallites contain predominantly 14-Å layers with some 24-Å units. Large crystals show randomly interlayered chlorite/smectite (C/S), with approximately 50% chlorite on average. Adjacent smectite-like layers are not uncommon in the groundmass phyllosilicates. Electron microprobe analyses show that Fe/Mg ratios of both groundmass and vesicular phyllosilicates are fairly constant. Termination of brucite-like interlayers has been identified in some of the TEM images. The transformation mechanisms represented by these layer terminations are 1) growth of a brucite-like interlayer within smectite interlayer regions and 2) the dissolution and reprecipitation of elements to form chlorite layers. Both mechanisms require an increase in volume as smectite transforms to chlorite. The data, combined with those from previously published reports, suggest that randomly interlayered C/S is a metastable phase formed in microenvironments with low water/rock ratios. Chlorite forms in microenvironments in the same sample dominated by higher water/rock ratios. The relatively constant Mg number (Mg#) of both structures indicates that in both microenvironments the bulk rock composition influences the composition of the phyllosilicates.
Abstract:
Intercalated Archean komatiites and dacites sit above a thick footwall dacite unit in the host rock succession at the Black Swan Nickel Mine, north of Kalgoorlie in the Yilgarn Craton, Western Australia. Both lithofacies occur in units that vary in scale from laterally extensive at the scale of the mine lease to localized, thin, irregular bodies, from > 100 m thick to only centimetres thick. Some dacites are only slightly altered and deformed, and are interpreted to post-date major deformation and alteration (late porphyries). However, the majority of the dacites display evidence of deformation, especially at contacts, and metamorphism, varying from silicification and chlorite alteration at contacts to pervasive low-grade regional metamorphic alteration represented by common assemblages of chlorite, sericite and albite. Texturally, the dacites vary from entirely massive and coherent, to partially brecciated, to totally brecciated. Curiously, some dacites are coherent at the margins and brecciated internally. Breccia textures vary from cryptically defined, to blocky, closely packed, in situ jigsaw-fit textures with secondary minerals in fractures between clasts, to more apparent matrix-rich textures with rounded clast forms, giving apparent conglomerate textures. Some clast zones have multi-coloured clasts, giving the impression of varied provenance. However, all these textural variants have gradational relationships with each other, and no bedding or depositional structures are present. This indicates that all textures have an in situ origin. The komatiites are generally altered and pervasively carbonate-veined. Preservation of original textures is patchy and local, but includes coarse adcumulate, mesocumulate, orthocumulate, crescumulate-harrisite and occasionally spinifex textures. Where original contacts between komatiites and dacites are preserved intact (i.e.
not sheared or overprinted by alteration), the komatiites have chilled margins, whereas the dacites do not. The margins of the dacites are commonly silicified, and inclusions of dacite occur in komatiite, even at the top contacts of komatiite units, but komatiite clasts do not occur in the dacites. The komatiites were therefore emplaced as sills into the dacites, and the intercalated relationships are interpreted as intrusive. The brecciation and alteration in the dacites are interpreted as being largely due to hydraulic fracturing and alteration induced by contact metamorphic effects and hydrothermal alteration arising from the intrusion of komatiites into the felsic pile. The absence of autobreccia and hyaloclastite textures in the dacites suggests that they were emplaced as an earlier intrusive (sill?) complex at a high level in the crust.
Abstract:
Cu/Ni/W nanolayered composites with individual layer thicknesses ranging from 5 nm to 300 nm were prepared by a magnetron sputtering system. Microstructures and strength of the nanolayered composites were investigated using the nanoindentation method combined with theoretical analysis. Microstructure characterization revealed that the Cu/Ni/W composite consists of a typical coherent Cu/Ni interface and incoherent Cu/W and Ni/W interfaces. Cu/Ni/W composites have an ultrahigh strength and a large strengthening ability compared with bi-constituent Cu–X (X = Ni, W, Au, Ag, Cr, Nb, etc.) nanolayered composites. Summarizing the present results and those reported in the literature, we systematically analyze the origin of the ultrahigh strength and its length-scale dependence by taking into account the constituent layer properties, layer scales and heterogeneous layer/layer interface characteristics, including lattice and modulus mismatch as well as interface structure.
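The length-scale dependence noted above is often discussed in terms of Hall-Petch-type scaling of strength with layer thickness; the following is a hedged sketch in which sigma0 and k are purely illustrative constants, not values measured for Cu/Ni/W:

```python
import math

def strength_mpa(h_nm, sigma0=300.0, k=3000.0):
    """Hall-Petch-type scaling: strength rises as layer thickness falls.
    sigma0 (MPa) and k (MPa * nm**0.5) are illustrative constants only."""
    return sigma0 + k / math.sqrt(h_nm)

for h in (5, 20, 100, 300):
    print(f"h = {h:3d} nm -> strength ~ {strength_mpa(h):.0f} MPa")
```

At the finest layer thicknesses real nanolayered metals deviate from this single power law (confined layer slip and interface crossing take over), which is part of the length-scale analysis the abstract describes.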
Abstract:
A well-developed brand helps to establish a solid identity and supports an image that is coherent with an institution's actual motivations. Educational institutions have inherent characteristics that differ from those of other kinds of institutions, particularly with regard to their internal and external publics. Consequently, these institutions should also approach the development of their brand and identity system differently. This research investigates the traditional methodology for developing brand and identity systems and proposes some modifications to allow a broader inclusion of stakeholders in the process. The implementation of the new Oceanography Course at the Federal University of Bahia (UFBA) offered a unique opportunity to investigate and test these new strategies. In order to investigate and relate the concepts of image, identity, interaction and experience through a participative methodology, this research project applies the newly suggested strategies in the development of a brand and an identity system for the Oceanography Course at UFBA. Open surveys were carried out among the alumni, lecturers and coordination body in order to discover and establish a symbol for the course. Statistical analysis of the survey results showed clear aesthetic preferences for certain icons and colours to represent the course. In this project, the participative methodology achieved a democratization of the generally expert-centred brand development process.
Abstract:
This paper provides a new general approach for defining coherent generators in power systems based on coherency in low-frequency inter-area modes. The disturbance is considered to be distributed across the network by applying random load changes, a random-walk representation of real loads, instead of a single fault, and coherent generators are obtained by spectral analysis of the generators' velocity variations. In order to find the coherent areas and their borders in interconnected networks, non-generating buses are assigned to each group of coherent generators using similar coherency detection techniques. The method is evaluated on two test systems, and coherent generators and areas are obtained for different operating points to provide a more accurate grouping approach that is valid across a range of realistic operating points of the system.
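The grouping step can be illustrated with a small stand-in: cluster generators whose speed-deviation signals are strongly correlated. This time-domain correlation is a proxy for the spectral coherency analysis the paper uses, and the threshold and synthetic two-mode signals below are our assumptions:

```python
import math
import random
from itertools import combinations

def correlation(x, y):
    """Pearson correlation of two equal-length signals."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x)
                    * sum((b - my) ** 2 for b in y))
    return num / den

def coherent_groups(signals, threshold=0.9):
    """Union-find grouping of generators whose velocity variations are
    strongly correlated (a simple stand-in for spectral coherency)."""
    names = list(signals)
    parent = {n: n for n in names}
    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]   # path compression
            n = parent[n]
        return n
    for a, b in combinations(names, 2):
        if correlation(signals[a], signals[b]) > threshold:
            parent[find(a)] = find(b)       # merge coherent pair
    groups = {}
    for n in names:
        groups.setdefault(find(n), set()).add(n)
    return sorted(map(sorted, groups.values()))

# Two synthetic inter-area modes with small measurement noise.
random.seed(0)
t = [i * 0.05 for i in range(200)]
mode_a = [math.sin(1.0 * s) for s in t]
mode_b = [math.sin(3.0 * s) for s in t]
signals = {
    'G1': [v + random.gauss(0, 0.05) for v in mode_a],
    'G2': [v + random.gauss(0, 0.05) for v in mode_a],
    'G3': [v + random.gauss(0, 0.05) for v in mode_b],
    'G4': [v + random.gauss(0, 0.05) for v in mode_b],
}
print(coherent_groups(signals))
```

Generators riding the same inter-area mode correlate near 1 and merge into one group, while cross-mode pairs fall well below the threshold.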
Abstract:
X-ray microtomography (micro-CT) with micron resolution enables new ways of characterizing microstructures and opens pathways for forward calculations of multiscale rock properties. A quantitative characterization of the microstructure is the first step in this challenge. We developed a new approach to extract scale-dependent characteristics of porosity, percolation, and anisotropic permeability from 3-D microstructural models of rocks. The Hoshen-Kopelman algorithm of percolation theory is employed for a standard percolation analysis. The anisotropy of permeability is calculated by means of the star volume distribution approach. The local porosity distribution and local percolation probability are obtained by using local porosity theory. Additionally, the local anisotropy distribution is defined and analyzed through two empirical probability density functions, the isotropy index and the elongation index. For such a high-resolution data set, the typical data sizes of the CT images are on the order of gigabytes to tens of gigabytes; thus an extremely large number of calculations is required. To resolve this large-memory problem, parallelization with OpenMP was used to optimally harness the shared-memory infrastructure of cache-coherent Non-Uniform Memory Access (ccNUMA) machines such as the iVEC SGI Altix 3700Bx2 supercomputer. We see adequate visualization of the results as an important element in this first pioneering study.
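The percolation check at the core of such an analysis can be sketched on a small binary voxel model. The BFS-based spanning test below is a simple stand-in for the Hoshen-Kopelman labelling the study uses; the face-connectivity rule and nested-list data layout are our assumptions:

```python
from collections import deque

def percolates(pore):
    """Check whether the pore phase spans the sample along the first
    axis via face-connected clusters. `pore` is a nested list of 0/1
    voxels indexed as pore[i][j][k]."""
    nx, ny, nz = len(pore), len(pore[0]), len(pore[0][0])
    # seed a breadth-first search from every pore voxel on the inlet face
    queue = deque((0, j, k) for j in range(ny) for k in range(nz)
                  if pore[0][j][k])
    seen = set(queue)
    while queue:
        i, j, k = queue.popleft()
        if i == nx - 1:
            return True                 # reached the outlet face: spanning
        for di, dj, dk in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            n = (i + di, j + dj, k + dk)
            if (0 <= n[0] < nx and 0 <= n[1] < ny and 0 <= n[2] < nz
                    and n not in seen and pore[n[0]][n[1]][n[2]]):
                seen.add(n)
                queue.append(n)
    return False

def porosity(pore):
    """Fraction of voxels belonging to the pore phase."""
    flat = [v for plane in pore for row in plane for v in row]
    return sum(flat) / len(flat)

# A 3x3x3 sample with a single straight pore channel along the first axis.
channel = [[[1 if (j, k) == (0, 0) else 0 for k in range(3)]
            for j in range(3)] for _ in range(3)]
print(percolates(channel), porosity(channel))
```

A production analysis labels every cluster (Hoshen-Kopelman) rather than testing a single spanning path, and parallelizes over subvolumes, which is where the OpenMP/ccNUMA machinery in the abstract comes in.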
Abstract:
Topic modeling has been widely utilized in the fields of information retrieval, text mining, text classification, etc. Most existing statistical topic modeling methods, such as LDA and pLSA, generate a term-based representation of a topic by selecting single words from the multinomial word distribution over that topic. There are two main shortcomings: firstly, popular or common words occur very often across different topics, which makes topics ambiguous and hard to interpret; secondly, single words lack the coherent semantic meaning needed to accurately represent topics. In order to overcome these problems, in this paper we propose a two-stage model that combines text mining and pattern mining with statistical modeling to generate more discriminative and semantically rich topic representations. Experiments show that the optimized topic representations generated by the proposed methods outperform the typical statistical topic modeling method LDA in terms of accuracy and certainty.
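The first shortcoming, and one simple remedy for it, can be illustrated by reweighting each topic's word probabilities against the word's average probability across topics. The distributions and the scoring rule below are our own illustration, not the paper's two-stage model:

```python
def distinctive_terms(topic_word, n=2):
    """Rank each topic's words by probability divided by the word's
    average probability across all topics, so words that are popular
    everywhere (the ambiguity problem) drop down the list."""
    vocab = {w for dist in topic_word.values() for w in dist}
    avg = {w: sum(d.get(w, 0.0) for d in topic_word.values())
              / len(topic_word)
           for w in vocab}
    ranked = {}
    for topic, dist in topic_word.items():
        scored = sorted(dist, key=lambda w: dist[w] / avg[w], reverse=True)
        ranked[topic] = scored[:n]      # keep the n most distinctive words
    return ranked

# Hypothetical topic-word distributions; 'data' is popular in both topics.
topic_word = {
    'retrieval': {'data': 0.30, 'query': 0.25, 'index': 0.20},
    'mining':    {'data': 0.30, 'pattern': 0.25, 'cluster': 0.20},
}
print(distinctive_terms(topic_word))
```

Even though 'data' has the highest raw probability in both topics, it scores 1.0 under the ratio and is displaced by topic-specific words, which is the kind of discriminative representation the paper's pattern-based approach aims at.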
Abstract:
The purpose of this research was to develop a theoretical understanding of the social phenomenon of the employment of foreign carers for older Taiwanese in households. Foreign carers were introduced into Taiwan in 1992 to address the care needs of the older population. By 2012, over 200,000 foreign caregivers from Indonesia, the Philippines, and Vietnam were providing care in households in Taiwan. There has been little research on the interactions between, and experiences of, family employers, foreign carers and older persons receiving care. The theoretical framework brought together symbolic interactionist concepts and the social constructionism of Berger and Luckmann. Data collection and analysis were informed by Charmaz's formulation of grounded theory. Two focus groups and 54 in-depth interviews with a total of 57 Indonesian and Vietnamese foreign carers, Taiwanese family employers and older persons receiving care were undertaken. The analytical findings of the research reflect the ways in which the foreign carer, older persons receiving care and family employer participants were socially situated within the research context and how their respective social realities were shaped differently by changing social structures and cultural values within a globalising context. (Re)-regulating care was generated as the core category, forming a coherent and overarching framework that integrated the three analytical dimensions of the reality of social change, resituating roles and struggling for control. The reality of social change refers to the employment of foreign carers as a manifestation of the reshaping of the social worlds of the three groups of participants. Resituating roles reflects the processes that underpin the hierarchical positioning of participants, the resultant asymmetrical power relations and associated interactions.
Struggling for control depicts how each group employed strategies to create space and identities that would sustain a sense of self and autonomy. In the current situation of economic and social change in Taiwan, the three participant groups shared a desire for control. The autonomy of the women employers was negotiated through the employment of foreign carers; for the foreign carers, a pragmatic decision to work abroad became a means of personal empowerment; and the older persons receiving care regained some authority through relationships with carers.
Abstract:
Postgraduate candidates in the creative arts encounter unique challenges when writing an exegesis (the written document that accompanies creative work as a thesis). As practitioner-researchers, they must adopt a dual perspective, looking out towards an established field of research, exemplars and theories, as well as inwards towards their experiential creative processes and practice. This dual orientation provides clear benefits, for it enables them to situate the research within its field and make objective claims for the research methodologies and outcomes while maintaining an intimate, voiced relationship with the practice. However, a dual orientation introduces considerable complexities in the writing. It requires a reconciliation of multi-perspectival subject positions: the disinterested academic posture of the observer/ethnographer/analyst/theorist at times, and the invested, subjective stance of the practitioner/producer at others. It requires the author to negotiate a range of writing styles and speech genres, from the formal, polemical style of the theorist to the personal, questioning and emotive voice of reflexivity. Moreover, these multi-variant orientations, subject positions, styles and voices must be integrated into a unified and coherent text. In this chapter I offer a conceptual framework and strategies for approaching this relatively new genre of thesis. I begin by summarizing the characteristics of what has begun to emerge as the predominant model of exegesis (the dual-oriented ‘Connective’ exegesis). Framing it against theoretical and philosophical understandings of polyvocality and matrixicality, I go on to point to recent textual models that provide precedents for connecting differently oriented perspectives, subjectivities and voices. I then turn to emergent archives of practice-led research to explain how the challenge of writing a ‘Connective’ exegesis has so far been resolved by higher degree research (HDR) candidates.
Exemplars illustrate a range of strategies they have used to compose a multi-perspectival text, reconcile the divergent subject positions of the practitioner-researcher, and harmonize the speech genres of a polyvocal text.
Abstract:
This book provides a general framework for specifying, estimating, and testing time series econometric models. Special emphasis is given to estimation by maximum likelihood, but other methods are also discussed, including quasi-maximum likelihood estimation, generalized method of moments estimation, nonparametric estimation, and estimation by simulation. An important advantage of adopting the principle of maximum likelihood as the unifying framework for the book is that many of the estimators and test statistics proposed in econometrics can be derived within a likelihood framework, thereby providing a coherent vehicle for understanding their properties and interrelationships. In contrast to many existing econometric textbooks, which deal mainly with the theoretical properties of estimators and test statistics through a theorem-proof presentation, this book squarely addresses implementation to provide direct conduits between the theory and applied work.
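As a toy instance of the maximum-likelihood principle the book adopts as its unifying framework, the following sketch checks numerically that the sample mean maximizes the Gaussian log-likelihood (illustrative simulated data, not an example from the book):

```python
import math
import random

def log_likelihood(mu, data, sigma=1.0):
    """Gaussian log-likelihood of the data at mean mu (sigma known)."""
    return sum(-0.5 * math.log(2 * math.pi * sigma ** 2)
               - (x - mu) ** 2 / (2 * sigma ** 2) for x in data)

random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(200)]
mle = sum(data) / len(data)          # closed-form MLE: the sample mean

# the log-likelihood at the MLE beats nearby candidate values of mu
for mu in (mle - 0.5, mle, mle + 0.5):
    print(f"mu = {mu:+.3f}  logL = {log_likelihood(mu, data):.2f}")
```

The same likelihood surface is what estimation and testing in the book are built on: curvature at the maximum gives standard errors, and differences in log-likelihood between restricted and unrestricted fits give likelihood-ratio statistics.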
Abstract:
Development and application of inorganic adsorbent materials have been continuously investigated due to their variability and versatility. This Master's thesis has expanded knowledge in the field of adsorption targeting radioactive iodine waste and proteins using modified inorganic materials. Industrial treatment of radioactive waste and the safe disposal of nuclear waste are constant concerns around the world as applications of radioactive materials develop. To address these problems, laminar titanate with a large surface area (143 m2 g−1) was synthesized from inorganic titanium compounds by hydrothermal reactions at 433 K. Ag2O nanocrystals with particle sizes ranging from 5–30 nm were anchored on the titanate lamina surface, which has crystallographic similarity to that of the Ag2O nanocrystals. The deposited Ag2O nanocrystals and the titanate substrate can therefore join at these surfaces to form a coherent interface. Such coherence between the two phases reduces the overall energy by minimizing surface energy and holds the Ag2O nanocrystals firmly on the outer surface of the titanate structure. The combined material was then applied as an efficient adsorbent to remove radioactive iodine from water (one gram of adsorbent can capture up to 3.4 mmol of I- anions), and the composite adsorbent can be recovered easily for safe disposal. The structural changes of the titanate lamina and the composite adsorbent were characterized via various techniques. The isotherm and kinetics of iodine adsorption, competitive adsorption and column adsorption were studied to determine the iodine removal abilities of the adsorbent. The adsorbent exhibited excellent trapping ability towards iodine in the fixed-bed column despite the presence of competing ions. Hence, Ag2O-deposited titanate lamina could serve as an effective adsorbent for removing iodine from radioactive waste.
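Isotherm studies like the one described are commonly summarised with the Langmuir model. The sketch below uses the reported 3.4 mmol/g figure as the saturation capacity purely for illustration, with an assumed affinity constant, not parameters fitted in the thesis:

```python
def langmuir_q(c, q_max=3.4, k=2.0):
    """Langmuir isotherm: adsorbed amount q (mmol/g) at equilibrium
    concentration c (mmol/L). q_max echoes the reported 3.4 mmol/g
    capacity; k is an illustrative affinity constant, not a fitted value."""
    return q_max * k * c / (1.0 + k * c)

for c in (0.1, 0.5, 1.0, 5.0):
    print(f"c = {c:4.1f} mmol/L -> q ~ {langmuir_q(c):.2f} mmol/g")
```

The model rises steeply at low concentration and saturates at q_max, which is the monolayer-capacity behaviour an isotherm fit is typically checked against.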
Surface hydroxyl groups of inorganic materials are widely exploited for modification purposes, and modification of inorganic materials for biomolecule adsorption can also be achieved. Specifically, γ-Al2O3 nanofibre material is obtained via calcination of a boehmite precursor synthesised by surfactant-directed hydrothermal reactions. These γ-Al2O3 nanofibres possess a large surface area (243 m2 g-1), good stability under extreme chemical conditions, good mechanical strength and abundant surface hydroxyl groups, making them an ideal candidate for industrial separation columns. The fibrous morphology of the adsorbent also allows facile recovery from aqueous solution by both centrifugation and sedimentation. By chemically bonding dye molecules to the surface, the charge properties of γ-Al2O3 are changed with the aim of selectively capturing lysozyme from chicken egg white solution. The highest lysozyme adsorption capacity obtained was around 600 mg/g, and the lysozyme proportion was elevated from around 5% to 69% in chicken egg white solution. Adsorption tests at different solution pH showed that electrostatic force played the key role in the good selectivity and high adsorption rate of the surface-modified γ-Al2O3 nanofibre adsorbents. Overall, surface-modified fibrous γ-Al2O3 could potentially be applied as an efficient adsorbent for capturing various biomolecules.
Abstract:
Electricity is the cornerstone of modern life. It is essential to economic stability and growth, jobs and improved living standards. Electricity is also the fundamental ingredient for a dignified life; it is the source of such basic human requirements as cooked food, a comfortable living temperature and essential health care. For these reasons, it is unimaginable that today's economies could function without electricity and the modern energy services that it delivers. Somewhat ironically, however, the current approach to electricity generation also contributes to two of the gravest and most persistent problems threatening the livelihood of humans. These problems are anthropogenic climate change and sustained human poverty. To address these challenges, the global electricity sector must reduce its reliance on fossil fuel sources. In this context, the object of this research is twofold. Initially it is to consider the design of the Renewable Energy (Electricity) Act 2000 (Cth) (Renewable Electricity Act), which represents Australia's primary regulatory approach to increase the production of renewable sourced electricity. This analysis is conducted by reference to the regulatory models that exist in Germany and Great Britain. Within this context, this thesis then evaluates whether the Renewable Electricity Act is designed effectively to contribute to a more sustainable and dignified electricity generation sector in Australia. On the basis of the appraisal of the Renewable Electricity Act, this thesis contends that while certain aspects of the regulatory regime have merit, ultimately its design does not represent an effective and coherent regulatory approach to increase the production of renewable sourced electricity. In this regard, this thesis proposes a number of recommendations to reform the existing regime. These recommendations are not intended to provide instantaneous or simple solutions to the current regulatory regime. 
Instead, the purpose of these recommendations is to establish the legal foundations for an effective regulatory regime that is designed to increase the production of renewable sourced electricity in Australia in order to contribute to a more sustainable and dignified approach to electricity production.