Abstract:
This paper seeks to review the operation of Australian corporate law rescue regimes in the context of those originally contemplated by Sir Kenneth Cork and, latterly in Australia, primarily in the hands of Ron Harmer. In doing so, it draws upon some of the observations made by Professor Fletcher during the second wave of 20th-century corporate rescue reform in the United Kingdom.
Abstract:
Background: The Mycobacterium leprae genome has less than 50% coding capacity and 1,133 pseudogenes. Preliminary evidence suggests that some pseudogenes are expressed. Therefore, defining the transcriptional and translational potential of this genome's pseudogenes should increase our understanding of their impact on M. leprae physiology. Results: Gene expression analysis identified transcripts from 49% of all M. leprae genes, including 57% of all ORFs and 43% of all pseudogenes in the genome. Transcribed pseudogenes were randomly distributed throughout the chromosome. Factors resulting in pseudogene transcription included: 1) co-orientation of transcribed pseudogenes with transcribed ORFs within or exclusive of operon-like structures; 2) the paucity of intrinsic stem-loop transcriptional terminators between transcribed ORFs and downstream pseudogenes; and 3) predicted pseudogene promoters. Mechanisms for translational "silencing" of pseudogene transcripts included the lack of both translational start codons and strong Shine-Dalgarno (SD) sequences. Transcribed pseudogenes also contained multiple "in-frame" stop codons and high Ka/Ks ratios compared with those of homologs in M. tuberculosis and ORFs in M. leprae. A pseudogene transcript with an active promoter, a strong SD site, and a start codon, but containing two in-frame stop codons, yielded a protein product when expressed in E. coli. Conclusion: Approximately half of the M. leprae transcriptome consists of inactive gene products that consume energy and resources without potential benefit to M. leprae. At present it is unclear what additional detrimental effect(s) this large number of inactive mRNAs has on the functional capability of this organism. Translation of these pseudogenes may play an important role in the overall energy consumption and resultant pathophysiological characteristics of M. leprae. However, this study also demonstrated that multiple translational "silencing" mechanisms are present, reducing the additional energy and resource expenditure required for protein production from the vast majority of these transcripts.
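The translational "silencing" signals described above (missing start codon, weak SD site, in-frame stop codons) can be illustrated as a simple transcript scan. A minimal sketch only: the toy sequence, the 15-nucleotide upstream window, and the AGGAGG motif used as an SD proxy are illustrative assumptions, not the study's methods.

# Minimal sketch of the "silencing" checks discussed above: look for a start
# codon, a crude Shine-Dalgarno-like motif upstream, and in-frame stop codons.
# The sequence and the "AGGAGG" SD proxy are illustrative assumptions.
STOPS = {"TAA", "TAG", "TGA"}

def silencing_signals(transcript: str, cds_start: int) -> dict:
    """Report features that would block translation of a pseudogene transcript."""
    upstream = transcript[max(0, cds_start - 15):cds_start]
    codons = [transcript[i:i + 3] for i in range(cds_start, len(transcript) - 2, 3)]
    return {
        "has_start_codon": transcript[cds_start:cds_start + 3] in ("ATG", "GTG", "TTG"),
        "has_sd_like_site": "AGGAGG" in upstream,   # crude SD-motif proxy
        "internal_stops": sum(c in STOPS for c in codons[:-1]),
    }

# Toy transcript with an SD-like site, a start codon, and one internal TAG stop.
print(silencing_signals("AAAGGAGGTTTTTTATGGCATAGGCGGCCTAA", 14))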
Abstract:
Homozygosity has long been associated with rare, often devastating, Mendelian disorders [1], and Darwin was one of the first to recognize that inbreeding reduces evolutionary fitness [2]. However, the effect of the more distant parental relatedness that is common in modern human populations is less well understood. Genomic data now allow us to investigate the effects of homozygosity on traits of public health importance by observing contiguous homozygous segments (runs of homozygosity), which are inferred to be homozygous along their complete length. Given the low levels of genome-wide homozygosity prevalent in most human populations, information is required on very large numbers of people to provide sufficient power [3, 4]. Here we use runs of homozygosity to study 16 health-related quantitative traits in 354,224 individuals from 102 cohorts, and find statistically significant associations between summed runs of homozygosity and four complex traits: height, forced expiratory lung volume in one second, general cognitive ability and educational attainment (P < 1 × 10⁻³⁰⁰, 2.1 × 10⁻⁶, 2.5 × 10⁻¹⁰ and 1.8 × 10⁻¹⁰, respectively). In each case, increased homozygosity was associated with decreased trait value, equivalent to the offspring of first cousins being 1.2 cm shorter and having 10 months' less education. Similar effect sizes were found across four continental groups and populations with different degrees of genome-wide homozygosity, providing evidence that homozygosity, rather than confounding, directly contributes to phenotypic variance. Contrary to earlier reports in substantially smaller samples [5, 6], no evidence was seen of an influence of genome-wide homozygosity on blood pressure and low density lipoprotein cholesterol, or ten other cardio-metabolic traits. Since directional dominance is predicted for traits under directional evolutionary selection [7], this study provides evidence that increased stature and cognitive function have been positively selected in human evolution, whereas many important risk factors for late-onset complex diseases may not have been.
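The core of the ROH burden analysis described above can be sketched briefly: sum run-of-homozygosity segment lengths per person into an F_roh fraction, then regress a trait on it. A minimal sketch with simulated data; the segment-length model and effect size are illustrative assumptions, and the study itself used 354,224 individuals, covariates, and per-cohort models.

import numpy as np

# Minimal sketch of an ROH burden test. All data here are simulated.
rng = np.random.default_rng(0)

n = 5000
# Per person: a random number of ROH segments with exponential lengths (Mb).
roh = [rng.exponential(2.0, rng.poisson(3)) for _ in range(n)]
froh = np.array([s.sum() for s in roh]) / 2881.0      # / approx. autosome length (Mb)
height = 170.0 - 20.0 * froh + rng.normal(0.0, 6.0, n)  # built-in directional dominance

# Ordinary least squares of trait on F_roh (intercept + slope).
X = np.column_stack([np.ones(n), froh])
beta, *_ = np.linalg.lstsq(X, height, rcond=None)
print(f"estimated slope: {beta[1]:.1f} cm per unit F_roh (simulated truth: -20)")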
Abstract:
This one-day workshop brings together researchers and practitioners to share knowledge and practices on how people can connect and interact with the Internet of Things in a playful way. Open to participants with a diverse range of interests and expertise, the workshop will explore novel ways to playfully connect people through their everyday objects and activities, facilitating discussion across a range of HCI discipline areas. The outcomes of the workshop will include an archive of participants' initial position papers along with the materials created during the workshop, and a road map to support the development of a Model of Playful Connectedness, focusing on how best to design and make playful networks of things and identifying the challenges that need to be addressed in order to do so.
Abstract:
The magnetic field of the Earth is 99% of internal origin, generated in the liquid outer core by the dynamo principle. In the 19th century, Carl Friedrich Gauss proved that the field can be described by a sum of spherical harmonic terms. This theory is still the basis of, for example, the IGRF (International Geomagnetic Reference Field) models, which are the most accurate description available of the geomagnetic field. On average, the dipole forms 3/4 and non-dipolar terms 1/4 of the instantaneous field, but the temporal mean of the field is assumed to be a pure geocentric axial dipole field. The validity of this GAD (Geocentric Axial Dipole) hypothesis has been assessed using several methods. In this work, the testing rests on the frequency distribution of inclination with respect to latitude. Each combination of dipole (GAD), quadrupole (G2) and octupole (G3) produces a distinct inclination distribution. These theoretical distributions have been compared with those calculated from empirical observations from different continents and, finally, from the entire globe. Only data from Precambrian rocks (over 542 million years old) have been used in this work. The basic assumption is that during the long-term course of drifting continents, the globe is sampled adequately. There were 2,823 observations altogether in the paleomagnetic database of the University of Helsinki. The effects of the quality of the observations, as well as of the age and rock type, have been tested. For comparison between theoretical and empirical distributions, chi-square testing has been applied. In addition, spatiotemporal binning has been used effectively to remove the errors caused by multiple observations. Modelling of the igneous rock data shows that the average magnetic field of the Earth is best described by a combination of a geocentric dipole and a very weak octupole (less than 10% of GAD). Filtering and binning gave the distributions a more GAD-like appearance, but the deviation from GAD increased as a function of the age of the rocks. The distribution calculated from the so-called key poles, the most reliable determinations, behaves almost like GAD, having a zero quadrupole and an octupole 1% of GAD. No earlier study has obtained a result this close to GAD for rocks older than 400 Ma; low inclinations have been prominent especially in the sedimentary data. Despite these results, a larger amount of high-quality data and a proof of the long-term randomness of the Earth's continental motions are needed to confirm that the dipole model holds true.
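The inclination test used above can be illustrated with a short sketch. For a purely zonal field built from axial dipole, quadrupole and octupole Gauss coefficients, the surface inclination at a given latitude follows from the radial and tangential field components; for a pure GAD this reduces to the classic relation tan I = 2 tan(latitude). A minimal sketch, assuming unit-normalized coefficients and illustrative G2/G3 values rather than values from the study:

import numpy as np

# Inclination of a zonal (axial dipole + quadrupole + octupole) field at the
# surface. Coefficients g1, g2, g3 are in relative units.
def inclination_deg(lat_deg, g1=1.0, g2=0.0, g3=0.0):
    theta = np.radians(90.0 - np.asarray(lat_deg))      # colatitude
    c, s = np.cos(theta), np.sin(theta)
    # Zonal Legendre polynomials P_l(cos theta) and their theta-derivatives.
    P = [c, 0.5 * (3 * c**2 - 1), 0.5 * (5 * c**3 - 3 * c)]
    dP = [-s, -3 * c * s, -0.5 * (15 * c**2 - 3) * s]
    g = [g1, g2, g3]
    Br = sum((l + 2) * g[l] * P[l] for l in range(3))   # factor (l+1), l = 1..3
    Bth = -sum(g[l] * dP[l] for l in range(3))
    return np.degrees(np.arctan2(Br, Bth))              # tan I = Br / Btheta

lats = np.array([0, 30, 60])
print(inclination_deg(lats))            # pure GAD: 0.0, 49.1, 73.9 degrees
print(inclination_deg(lats, g3=0.1))    # a 10% octupole shallows mid-latitude I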
Abstract:
As the virtual world grows more complex, finding a standard way of storing data becomes increasingly important. Ideally, each data item would be brought into the computer system only once. References to data items need to be cryptographically verifiable, so the data can maintain its identity while being passed around. This way there will be only one copy of the user's family photo album, while the user can use multiple tools to show or manipulate the album. Copies of the user's data could be stored on some of his family members' computers, on some of his own computers, and also at some online services which he uses. When all actors operate over one replicated copy of the data, the system automatically avoids a single point of failure: the data will not disappear when one computer breaks or one service provider goes out of business. One shared copy also makes it possible to delete a piece of data from all systems at once, at the user's request. In our research we tried to find a model that would make data manageable for users and make it possible to have the same data stored at various locations. We studied three systems, Persona, Freenet, and GNUnet, that suggest different models for protecting user data. The main application areas of the systems studied include securing online social networks, providing anonymous web access, and preventing censorship in file sharing. Each of the systems studied stores user data on machines belonging to third parties. The systems differ in the measures they take to protect their users from data loss, forged information, censorship, and monitoring. All of the systems use cryptography to secure the names used for content and to protect the data from outsiders. Based on the knowledge gained, we built a prototype platform called Peerscape, which stores user data in a synchronized, protected database. The data items themselves are protected with cryptography against forgery, but not encrypted, as the focus has been on disseminating the data directly among family and friends instead of letting third parties store the information. We turned the synchronizing database into a peer-to-peer web by exposing its contents through an integrated HTTP server. The REST-like HTTP API supports the development of applications in JavaScript. To evaluate the platform's suitability for application development we wrote some simple applications, including a public chat room, a BitTorrent site, and a flower-growing game. During our early tests we came to the conclusion that using the platform for simple applications works well. As web standards develop further, writing applications for the platform should become easier. Any system this complex will have its problems, and we are not expecting our platform to replace the existing web, but we are fairly impressed with the results and consider our work important from the perspective of managing user data.
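The cryptographically verifiable references described above can be sketched with content-addressed naming: name a data item by a hash of its bytes, so any replica (a relative's machine, another of the user's computers, an online service) can be checked without trusting its host. A minimal sketch of the general technique only, not Peerscape's actual naming scheme:

import hashlib

# Name a data item by the SHA-256 hash of its bytes; any holder of the name
# can verify a copy, so the item keeps its identity while being passed around.
def content_name(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify(name: str, data: bytes) -> bool:
    return content_name(data) == name

photo = b"...family photo album bytes..."
name = content_name(photo)
assert verify(name, photo)                  # an honest replica checks out
assert not verify(name, photo + b"forged")  # a tampered copy is rejected
print("reference:", name[:16], "...")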
Abstract:
The study examines individuals active in social movements, that is, activists. The roots of the study lie at the macro level, in the debate on social movements; at the same time, its central focus is on the micro level, on individuals and actors. The study examines how activism intertwines with personal life, and the problematics of the continuation and ending of activism. The interpretation of the data is based on a network perspective that combines subjective and objective meanings, and on the concept of a life-sphere hierarchy developed from that perspective (Passy & Giugni 2000). The life-sphere hierarchy is combined with Weber's (2009) view of the qualities that political action requires. On the basis of these qualities, it is argued that political action, such as activism, can be interpreted through the concept of ethos. Research questions derived from this theoretical synthesis are used to seek answers to how an activist's career continues or ends. Despite its individual-level starting point, the study is ultimately also tied to analysis at a more general level. The study is a qualitative follow-up study. Its data consist of eleven thematic interviews conducted by Ari Rasimus (2006) in 2000 and 2001, and six repeat interviews from 2010. The interviewees are activists who in the 1990s were active in the organizations Oikeutta eläimille (Justice for Animals) and Maan Ystävät (Friends of the Earth). Because of the comparative setting, a central theme of the study is change and its explanation. None of the interviewees was still active in the movements. Withdrawal from activity had happened gradually, by fading away. Activism, understood in the traditional sense, was thus a thing of the past. The activists' attitudes, however, had remained almost unchanged. For some, activism had become intertwined with their working career; for others it had not. The study found that the concept of ethos explains why activism still guides life even when a career in a civic movement has ended. On the other hand, the concept of ethos also helps in understanding the abandonment of activism. Using the concept of ethos, the interviewees were divided into two groups: leavers and stayers. The leavers' withdrawal was explained by renouncing the ethos, whereas staying was grounded in the transformation of activism into work and the ethos thereby maintained. The main theoretical contribution of the study is a deepening of the life-sphere hierarchy theory that explains withdrawal. The study argues that the ending of activism can be examined more broadly than as mere membership of an organization or a social movement. Among other things, it is necessary to consider how activism intertwines with various lifestyle choices. The mutual influences of different life spheres should also be examined more deeply than merely in the light of their rank order. By combining Weber's (2009) ideas about the desirable qualities of politicians with the life-sphere hierarchy theory, one can arrive at an interpretation in which attachment to issues and goals (goal rationality) would predict staying in the activity, especially relative to network-based and emotional attachment. The study finds that the activists have integrated into society, and the stayers in particular are also involved in party politics. On this basis, the study asks whether it would be more rational for society to regard movements with interest rather than fear.
Abstract:
The study analyzes the effort to build political legitimacy in the Republic of Turkey by exploring a group of influential texts produced by Kemalist writers. The study explores how the Kemalist regime reproduced a certain long-lasting enlightenment meta-narrative in its effort to build political legitimacy. Central in this process was a hegemonic representation of history, namely the interpretation of the Anatolian Resistance Struggle of 1919–1922 as a Turkish Revolution executing the enlightenment in the Turkish nation-state. The method employed in the study is contextualizing narratological analysis. The Kemalist texts are analyzed with a repertoire of concepts originally developed in the theory of narrative. By bringing these concepts together with the epistemological foundations of the historical sciences, the study creates a theoretical frame inside of which it is possible to highlight how initially very controversial historical representations in the end manage to construct long-lasting, emotionally and intellectually convincing bases of national identity for the secular middle classes in Turkey. The two most important explanatory concepts in this sense are diegesis and implied reader. Diegesis refers to the ability of narrative representation to create an inherently credible story-world that works as the basis of national community. The implied reader refers to the process whereby a certain hegemonic narrative creates a formula of identification and a position through which any individual real-world reader of a story can step inside the narrative story-world and identify oneself as one of "us" of the national narrative. The study demonstrates that the Kemalist enlightenment meta-narrative created a group of narrative accruals which enabled generations of the secular middle classes to internalize Kemalist ideology. In this sense, the narrative in question has not only worked as a tool utilized by the so-called Kemalist state-elite to justify its leadership, but has been internalized by various groups in Turkey, working as their genuine world-view. It is shown in the study that secularism must be seen as the core ingredient of these groups' national identity. The study proposes that the enlightenment narrative reproduced in Kemalist ideology had its origin in a similar totalizing cultural narrative created in and for Europe. Currently this enlightenment project is challenged in Turkey by those who seek to give religion a greater role in Turkish society. The study argues that the enduring practice of legitimizing political power through the enlightenment meta-narrative has not only become a major factor contributing to social polarization in Turkey, but has also, in contradiction to the very real potential for critical approaches inherent in the Enlightenment tradition, crucially restricted the development of critical and rational modes of thinking in the Republic of Turkey.
Abstract:
We present a new Hessian estimator based on the simultaneous perturbation procedure that requires three system simulations regardless of the parameter dimension. We then present two Newton-based simulation optimization algorithms that incorporate this Hessian estimator. The two algorithms differ primarily in the manner in which the Hessian estimate is used. Neither of our algorithms computes the inverse Hessian explicitly, thereby saving computational effort. While our first algorithm directly obtains the product of the inverse Hessian with the gradient of the objective, our second algorithm makes use of the Sherman-Morrison matrix inversion lemma to recursively estimate the inverse Hessian. We provide proofs of convergence for both algorithms. Next, we consider an interesting application of our algorithms to a problem of road traffic control. Our algorithms are seen to exhibit better performance than two Newton algorithms from a recent prior work.
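The recursion in the second algorithm rests on the Sherman-Morrison identity, (A + u vᵀ)⁻¹ = A⁻¹ − A⁻¹ u vᵀ A⁻¹ / (1 + vᵀ A⁻¹ u), which updates an inverse after a rank-one correction without any matrix inversion. A minimal sketch of that core step (the averaging scheme and perturbation sequences of the actual algorithms are omitted):

import numpy as np

# Rank-one inverse update via Sherman-Morrison: given H_inv = A^{-1} and a
# correction A -> A + u v^T, return the new inverse without inverting.
def sherman_morrison_update(H_inv, u, v):
    Hu = H_inv @ u
    vH = v @ H_inv
    return H_inv - np.outer(Hu, vH) / (1.0 + v @ Hu)

# Sanity check against a direct inverse.
rng = np.random.default_rng(1)
A = np.eye(4) + 0.1 * rng.standard_normal((4, 4))
u, v = rng.standard_normal(4), rng.standard_normal(4)
lhs = sherman_morrison_update(np.linalg.inv(A), u, v)
rhs = np.linalg.inv(A + np.outer(u, v))
print(np.allclose(lhs, rhs))  # True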
Abstract:
A comprehensive study of the stress release and structural changes caused by post-deposition thermal annealing of tetrahedral amorphous carbon (ta-C) on Si has been carried out. Complete stress relief occurs at 600–700 °C and is accompanied by minimal structural modifications, as indicated by electron energy loss spectroscopy, Raman spectroscopy, and optical gap measurements. Further annealing in vacuum converts sp3 sites to sp2, with a drastic change occurring above 1100 °C. The field emission behavior is substantially retained up to complete stress relief, confirming that ta-C is a robust emitting material. © 1999 American Institute of Physics.
Abstract:
A compact electron cyclotron wave resonance (ECWR) source has been developed for the high-rate deposition of hydrogenated tetrahedral amorphous carbon (ta-C:H). The ECWR provides growth rates of up to 1.5 nm/s over a 4-inch diameter and independent control of the deposition rate and ion energy. The ta-C:H was deposited using acetylene as the source gas and was characterized as having an sp3 content of up to 77%, a plasmon energy of 27 eV, a refractive index of 2.45, a hydrogen content of about 30%, an optical gap of up to 2.1 eV, and an RMS surface roughness of 0.04 nm. © 1999 Elsevier Science S.A. All rights reserved.
Abstract:
A compact electron cyclotron wave resonance (ECWR) source has been developed for the high-rate deposition of hydrogenated tetrahedral amorphous carbon (ta-C:H). The ECWR provides growth rates of up to 900 Å/min and independent control of the deposition rate and ion energy. The ta-C:H was deposited using acetylene as the source gas and was characterized in terms of its bonding, stress, and friction coefficient. The results indicated that the ta-C:H produced using this source fulfills the necessary requirements for applications requiring enhanced tribological performance.