465 results for ent kaurane diterpene


Relevance:

10.00%

Publisher:

Abstract:

With the variety of PV inverter types and the number of transformerless PV inverters on the Australian market increasing, we revisit some of the issues associated with these topologies. A recent electric shock incident in Queensland (fortunately without serious consequences) involving a transformerless PV system highlights the need to earth PV array structures and PV module frames to prevent capacitive leakage currents from causing electric shock. The presented test results of the relevant voltages associated with leakage currents of five transformerless PV inverters underline this requirement, which is currently being addressed by both the Clean Energy Council and Standards Australia. DC current injection tests were performed on the same five inverters and were used to develop preliminary recommendations for a more meaningful DC current test procedure for AS4777 Part 2. The test circuit, methodology and results are presented and discussed. A notable temperature dependency of DC current injection in three of the five inverters suggests that DC current injection should be tested at both high and low internal inverter temperatures, whereas the power dependency noted for only one inverter does not seem to justify recommending a (rather involved) standard test procedure at different power levels.
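The shock mechanism discussed here, capacitive leakage through an unearthed array, can be illustrated with a first-order estimate. This is only a sketch: the relation used is the standard capacitive reactance formula I = 2πfCV, and the capacitance and voltage figures below are illustrative assumptions, not values from the reported tests.

```python
import math

def leakage_current_rms(c_farads: float, v_rms: float, f_hz: float = 50.0) -> float:
    """First-order RMS leakage current through a parasitic capacitance.

    Models the array-to-frame capacitance as an ideal capacitor driven
    by a common-mode voltage v_rms at frequency f_hz: I = 2*pi*f*C*V.
    """
    return 2 * math.pi * f_hz * c_farads * v_rms

# Illustrative assumption: 100 nF of array-to-frame capacitance driven
# by 230 V RMS at 50 Hz (hypothetical values, not measurements).
i_leak = leakage_current_rms(100e-9, 230.0)
print(f"{i_leak * 1000:.2f} mA")  # a few milliamps -- enough to be perceptible
```

Even these modest numbers land in the perceptible-current range, which is why earthing the frames, rather than relying on the inverter alone, matters for transformerless topologies.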


The highly complex structure of the human brain is strongly shaped by genetic influences. Subcortical brain regions form circuits with cortical areas to coordinate movement, learning, memory and motivation, and altered circuits can lead to abnormal behaviour and disease. To investigate how common genetic variants affect the structure of these brain regions, here we conduct genome-wide association studies of the volumes of seven subcortical regions and the intracranial volume derived from magnetic resonance images of 30,717 individuals from 50 cohorts. We identify five novel genetic variants influencing the volumes of the putamen and caudate nucleus. We also find stronger evidence for three loci with previously established influences on hippocampal volume and intracranial volume. These variants show specific volumetric effects on brain structures rather than global effects across structures. The strongest effects were found for the putamen, where a novel intergenic locus with replicable influence on volume (rs945270; P = 1.08×10⁻³³; 0.52% variance explained) showed evidence of altering the expression of the KTN1 gene in both brain and blood tissue. Variants influencing putamen volume clustered near developmental genes that regulate apoptosis, axon guidance and vesicle transport. Identification of these genetic variants provides insight into the causes of variability in human brain development, and may help to determine mechanisms of neuropsychiatric dysfunction.
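The "0.52% variance explained" figure comes from a standard quantitative-genetics calculation: under an additive model, a biallelic variant with effect-allele frequency p and per-allele effect beta contributes 2p(1−p)·beta² to the trait variance. A minimal sketch, with hypothetical numbers rather than the actual rs945270 estimates:

```python
def variance_explained(beta: float, p: float, trait_sd: float) -> float:
    """Fraction of trait variance explained by a biallelic SNP (additive model).

    beta: per-allele effect in trait units; p: effect-allele frequency.
    Variance contributed by the SNP is 2*p*(1-p)*beta**2.
    """
    return 2.0 * p * (1.0 - p) * beta**2 / trait_sd**2

# Hypothetical illustration only (not the reported rs945270 values):
# allele frequency 0.30, effect of 10 mm^3 per allele, trait SD of 280 mm^3.
frac = variance_explained(beta=10.0, p=0.30, trait_sd=280.0)
print(f"{100 * frac:.3f}% of variance explained")
```

The tiny fractions this formula produces for realistic effect sizes are exactly why sample sizes in the tens of thousands, as in this study, are needed to detect single variants.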


Meta-analyses estimate a statistical effect size for a test or an analysis by combining results from multiple studies without necessarily having access to each individual study's raw data. Multi-site meta-analysis is crucial for imaging genetics, as single sites rarely have a sample size large enough to pick up effects of single genetic variants associated with brain measures. However, if raw data can be shared, combining data in a "mega-analysis" is thought to improve power and precision in estimating global effects. As part of an ENIGMA-DTI investigation, we use fractional anisotropy (FA) maps from 5 studies (total N=2,203 subjects, aged 9-85) to estimate heritability. We combine the studies through meta- and mega-analyses as well as a mixture of the two: combining some cohorts with mega-analysis and meta-analyzing the results with those of the remaining sites. A combination of mega- and meta-approaches may boost power compared to meta-analysis alone.
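As a sketch of the meta-analytic side of this comparison, a fixed-effect, inverse-variance-weighted pooling of per-cohort heritability estimates might look like the following; the cohort values are hypothetical, not the ENIGMA-DTI results.

```python
def inverse_variance_meta(estimates, std_errs):
    """Fixed-effect meta-analysis: inverse-variance-weighted pooled estimate.

    Returns the pooled estimate and its standard error. Cohorts with
    smaller standard errors receive proportionally larger weights.
    """
    weights = [1.0 / se**2 for se in std_errs]
    total = sum(weights)
    pooled = sum(w * est for w, est in zip(weights, estimates)) / total
    return pooled, (1.0 / total) ** 0.5

# Hypothetical per-cohort FA heritability estimates (h2) and standard errors:
h2 = [0.55, 0.62, 0.48, 0.70, 0.52]
se = [0.08, 0.06, 0.10, 0.09, 0.07]
pooled, pooled_se = inverse_variance_meta(h2, se)
print(f"pooled h2 = {pooled:.3f} (SE {pooled_se:.3f})")
```

The pooled standard error is always smaller than any single cohort's, which is the formal sense in which combining sites "boosts power"; a mega-analysis goes further by modeling the pooled raw data directly.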


Combining datasets across independent studies can boost statistical power by increasing the number of observations and can yield more accurate estimates of effect sizes. This is especially important for genetic studies, where a large number of observations is required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for the joint analysis of the rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2248 children and adults (ages 9-85) collected with various imaging protocols. We used the imaging genetics analysis tool SOLAR-Eclipse to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large "mega-family". We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical approaches (sample-size weighted and standard-error weighted) and a mega-genetic analysis to calculate heritability estimates across populations. We performed a leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time, to understand the variability of the estimates. Overall, meta- and mega-genetic analyses of heritability produced robust estimates of heritability.
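Two of the procedures named here, sample-size-weighted pooling and the leave-one-out stability check, can be sketched in a few lines; the per-cohort values below are hypothetical, not the actual ENIGMA-DTI estimates.

```python
def sample_size_weighted_mean(estimates, sizes):
    """Sample-size-weighted pooled estimate (one of the two meta approaches)."""
    return sum(n * est for n, est in zip(sizes, estimates)) / sum(sizes)

def leave_one_out(estimates, sizes):
    """Re-pool with each cohort removed in turn, to gauge estimate stability."""
    return [
        sample_size_weighted_mean(estimates[:i] + estimates[i + 1:],
                                  sizes[:i] + sizes[i + 1:])
        for i in range(len(estimates))
    ]

# Hypothetical per-cohort FA heritability estimates and sample sizes:
h2 = [0.55, 0.62, 0.48, 0.70, 0.52]
n = [500, 600, 300, 450, 400]
print(sample_size_weighted_mean(h2, n))   # pooled estimate
print(leave_one_out(h2, n))               # one jackknife estimate per cohort
```

If the leave-one-out values stay close to the full pooled estimate, no single cohort is driving the result, which is the robustness the abstract reports.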


This issue on the genetics of brain imaging phenotypes is a celebration of the happy marriage between two of science's most interesting fields: neuroscience and genetics. The articles collected here are ample evidence that a good deal of synergy exists in this marriage. A wide selection of papers is presented that provide many different perspectives on how genes cause variation in brain structure and function, which in turn influence behavioral phenotypes (including psychopathology). They are examples of the many different methodologies in contemporary genetics and neuroscience research. Genetic methodology includes genome-wide association (GWA), candidate-gene association, and twin studies. Sources of data on brain phenotypes include cortical gray matter (GM) structural/volumetric measures from magnetic resonance imaging (MRI); white matter (WM) measures from diffusion tensor imaging (DTI), such as fractional anisotropy; and functional (activity-based) measures from electroencephalography (EEG) and functional MRI (fMRI). Together, they reflect a combination of scientific fields that have seen great technological advances, whether it is the single-nucleotide polymorphism (SNP) array in genetics, increasingly high-resolution MRI, or high angular resolution diffusion imaging for measuring WM connective properties.


The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience, genetics, and medicine, ENIGMA studies have analyzed neuroimaging data from over 12,826 subjects. In addition, data from 12,171 individuals were provided by the CHARGE consortium for replication of findings, in a total of 24,997 subjects. By meta-analyzing results from many sites, ENIGMA has detected factors that affect the brain that no individual site could detect on its own, and that require larger numbers of subjects than any individual neuroimaging study has currently collected. ENIGMA's first project was a genome-wide association study identifying common variants in the genome associated with hippocampal volume or intracranial volume. Continuing work is exploring genetic associations with subcortical volumes (ENIGMA2) and white matter microstructure (ENIGMA-DTI). Working groups also focus on understanding how schizophrenia, bipolar illness, major depression and attention deficit/hyperactivity disorder (ADHD) affect the brain. We review the current progress of the ENIGMA Consortium, along with challenges and unexpected discoveries made on the way.


Joseph Brodsky, one of the most influential Russian intellectuals of the late Soviet period, was born in Leningrad in 1940, emigrated to the United States in 1972, received the Nobel Prize for Literature in 1987, and died in New York City in 1996. Brodsky was one of the leading public figures of Soviet emigration in the Cold War period, and his role as a model for the construction of Russian cultural identities in the last years of the Soviet Union was, and still is, extremely important. One of Joseph Brodsky's great contributions to Russian culture of the latter half of the twentieth century is the wide geographical scope of his poetic and prose works. Brodsky was not a travel writer, but he was a traveling writer who wrote a considerable number of poems and essays relating to his trips and travels within the Soviet empire and outside it. Travel writing offered Brodsky a discursive space for negotiating his own transculturation, as well as for making powerful statements about displacement, culture, history and geography, time and space, all major themes of his poetry. In this study of Joseph Brodsky's travel writing I focus on his travel texts in poetry and prose relating to his post-1972 trips to Mexico, Brazil, Turkey, and Venice. Questions of empire, tourism, and nostalgia are foregrounded in one way or another in Brodsky's travel writing produced in emigration. I explore these concepts through the study of tropes, strategies of identity construction, and the politics of representation. The theoretical premises of my work draw on the literary and cultural criticism that has evolved around the study of travel and travel writing in recent years. These approaches have gained much from the scholarly experience provided by postcolonial critique.
Shifting the focus away from the concept of exile, the traditional framework for scholarly discussions of Brodsky's works, I propose to read Brodsky's travel poetry and prose as a response not only to his exilic condition but also to the postmodern and postcolonial landscape that initially shaped the writing of these texts. Discussing Brodsky's travel writing in this context offers previously unexplored perspectives for analyzing the geopolitical, philosophical, and linguistic premises of his poetic imagination. By situating Brodsky's travel writing in the geopolitical landscape of postcolonial postmodernity, I attempt to show how Brodsky's engagement with contemporary cultural practices in the West was incorporated into his Russian-language travel poetry and prose, and how this engagement contributed to these texts' status as exceptional and unique literary events within late Soviet Russian cultural practice.


Tolerance of Noise as a Necessity of Urban Life: Noise Pollution as an Environmental Problem and Its Cultural Perceptions in the City of Helsinki. This study looks at the noise pollution problem and the change in the urban soundscape in the city of Helsinki from the 1950s to the present day. The study investigates the formation of noise problems, the politicization of the noise pollution problem, noise-related civic activism, the development of environmental policies on noise, and the expectations that urban dwellers have had concerning their everyday soundscape. Both so-called street noise and the noise caused by, e.g., neighbors are taken into account. The study investigates whether our society contains, or has for some time contained, cultural and other elements that frame noise pollution as an essential or normal state of affairs in urban life. It also discusses whether we are moving towards an artificial soundscape, meaning that the auditory reality, the soundscape, is more and more under human control. The concept of an artificial soundscape was used to crystallize the significance of human actions and the role of modern technology in shaping soundscapes, and also to link the changes in the modern soundscape to the economic, political, and social changes connected to the modernization process. It is argued that the critical period in defining noise pollution as an environmental problem was the years from the end of the 1960s to the early 1970s. It seems that the massive increase in noise pollution caused by road traffic, together with the introduction of utopian traffic plans, was the key factor that launched the moral protest against the increase in noise pollution and, more generally, against the basic structures and mindsets of society, including attitudes towards nature. The study argues that after noise pollution was politicized and institutionalized, the urban soundscape gradually became the target of systematic interventions.
However, for various reasons, such as inconsistency in decision making, our increased capacity to shape the soundscape has not resulted in a healthy or pleasant urban soundscape. In fact, the number of people exposed to noise pollution is increasing. It is argued that our society contains cultural and other elements that urge us to see noise as a normal part of urban life. It is also argued that the possibility of experiencing natural, silent soundscapes seems to be the yardstick against which the citizens of Helsinki have measured how successful we are in designing the (artificial) soundscape and whether noise control measures have been effective. This work discusses whose interests it serves when we are asked to accept noise pollution as a normal state of affairs. It is also suggested that the quality of the artificial soundscape ought to be radically politicized, which might give all citizens a better and more equal chance to express their needs and wishes concerning the urban soundscape, and also to decide how it ought to be designed.


Matrix decompositions, where a given matrix is represented as a product of two other matrices, are regularly used in data mining. Most matrix decompositions have their roots in linear algebra, but the needs of data mining are not always those of linear algebra. In data mining one needs results that are interpretable, and what is considered interpretable in data mining can be very different from what is considered interpretable in linear algebra.

The purpose of this thesis is to study matrix decompositions that directly address the issue of interpretability. An example is a decomposition of binary matrices where the factor matrices are assumed to be binary and the matrix multiplication is Boolean. The restriction to binary factor matrices increases interpretability, since the factor matrices are of the same type as the original matrix, and allows the use of Boolean matrix multiplication, which is often more intuitive than normal matrix multiplication with binary matrices. Several other decomposition methods are also described, and the computational complexity of computing them is studied together with the hardness of approximating the related optimization problems. Based on these studies, algorithms for constructing the decompositions are proposed. Constructing the decompositions turns out to be computationally hard, and the proposed algorithms are mostly based on various heuristics. Nevertheless, the algorithms are shown to be capable of finding good results in empirical experiments conducted with both synthetic and real-world data.
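To make the Boolean decomposition concrete, here is a minimal sketch in pure Python with small illustrative matrices: under the Boolean product, overlapping factors do not "add up" beyond 1, which is exactly what distinguishes it from the ordinary matrix product over the reals.

```python
def boolean_product(U, V):
    """Boolean matrix product: (U o V)[i][j] = OR over k of (U[i][k] AND V[k][j])."""
    rows, inner, cols = len(U), len(V), len(V[0])
    return [
        [int(any(U[i][k] and V[k][j] for k in range(inner))) for j in range(cols)]
        for i in range(rows)
    ]

def reconstruction_error(A, B):
    """Number of disagreeing entries -- the usual objective to minimize in
    Boolean matrix factorization."""
    return sum(a != b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

# A rank-2 Boolean factorization of A. The middle entry of A is covered by
# BOTH rank-1 factors; the Boolean product still yields 1 there (the real
# product would yield 2), so the reconstruction is exact.
U = [[1, 0],
     [1, 1],
     [0, 1]]
V = [[1, 1, 0],
     [0, 1, 1]]
A = [[1, 1, 0],
     [1, 1, 1],
     [0, 1, 1]]
assert boolean_product(U, V) == A
assert reconstruction_error(A, boolean_product(U, V)) == 0
```

Because overlaps are free under the Boolean product, each factor can be read directly as a pattern of the same type as the data, which is the interpretability argument made above.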


In the 1930s the Finnish forest industry bought more than half of the timber used in its factories and sawmills from non-industrial private forests (NIPF). This research investigates the rules that governed this timber trade. The main research questions are: what rules influenced the timber trade, and who set them up? Attention is also paid to the factors that strengthened the forest owners' position in negotiations. A variety of sources were used: legal and company statutes, timber trade contracts, and the archives of the forest companies and organisations. Moreover, written reminiscences collected by the Finnish Literature Society in the early 1970s were used to analyse the views of individual sellers and buyers. An institutional economics approach was applied as the theoretical framework of this study. In the timber trade the seller (the forest owner) and the buyer (the employee of the forest company) agreed on the rules of the trade. They agreed on the amount and the price of the timber on sale, but also on rules concerning, e.g., timber marking and harvesting. The forest companies had strong control over the written contracts. Neither the private forest owners nor the forest organisations had much influence over these contracts. However, they managed to influence the rules that could not be found in the contracts. These written and unwritten rules regulated, for instance, timber marking and measurement. Forest organisations such as the Central Forestry Board Tapio (Keskusmetsäseura Tapio) and the associations of forest owners (metsänhoitoyhdistykset) helped private forest owners gain more control over timber marking. In timber marking, the forest owner selected the trees to be included in the timber trade and gained more information, which he could use in the negotiations. The other rule that was changed despite the forest companies' resistance was timber measurement.
The Central Union of Agricultural Producers (MTK) negotiated with the Central Association of Finnish Woodworking Industries (SPKL) about changing the rules of measurement practice. Even though SPKL did not support any changes, the new timber measurement law was passed in 1938. The new law also created a supervisory authority to resolve possible disagreements. Despite this, the forest companies were still in charge of the measurement process in most cases. The private forest owners achieved changes in the rules of the timber trade mainly during the 1930s. Earlier, the relative weakness of the private forest organisations had diminished their negotiating position. This changed in the 1930s as the private forest owners and their organisations became more active. At the same time the forest industry experienced a shortage of timber, especially pulpwood, and this provided the private forest owners with more leverage. Full text (in Finnish) available at http://helda.helsinki.fi/handle/10224/4081


Einstein's general relativity is a classical theory of gravitation: it is a postulate on the coupling between the four-dimensional, continuous spacetime and the matter fields in the universe, and it yields their dynamical evolution. It is believed that general relativity must be replaced by a quantum theory of gravity, at least at the extremely high energies of the early universe and in regions of strong spacetime curvature, such as black holes. Various attempts to quantize gravity, including conceptually new models such as string theory, have suggested that modifications to general relativity might show up even at lower energy scales. On the other hand, the late-time acceleration of the expansion of the universe, known as the dark energy problem, might also originate from new gravitational physics. Thus, although there has so far been no direct experimental evidence contradicting general relativity (on the contrary, it has passed a variety of observational tests), it is a question worth asking: why should the effective theory of gravity be of the exact form of general relativity? If general relativity is modified, how do the predictions of the theory change? Furthermore, how far can we go with the changes before we are faced with contradictions with experiment? Along with the changes, could there be new phenomena that we could measure to find hints of the form of the quantum theory of gravity? This thesis is about a class of modified gravity theories called f(R) models, and in particular about the effects of changing the theory of gravity on stellar solutions. It is discussed how experimental constraints from measurements in the Solar System restrict the form of f(R) theories. Moreover, it is shown that models which do not differ from general relativity at the weak-field scale of the Solar System can produce very different predictions for dense stars such as neutron stars.
Due to the nature of f(R) models, the role of an independent spacetime connection is emphasized throughout the thesis.
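For reference, the textbook form of the f(R) framework discussed above (standard notation, not taken from the thesis itself): the Einstein-Hilbert Lagrangian R is generalized to a function f(R).

```latex
% Action: the Einstein--Hilbert Lagrangian R generalized to f(R)
S = \frac{1}{2\kappa}\int \mathrm{d}^4x\, \sqrt{-g}\, f(R) + S_{\mathrm{m}}[g_{\mu\nu}, \psi]

% Metric-formalism field equations (reduce to Einstein's equations for f(R) = R):
f'(R)\, R_{\mu\nu} - \tfrac{1}{2} f(R)\, g_{\mu\nu}
  - \left( \nabla_\mu \nabla_\nu - g_{\mu\nu} \Box \right) f'(R)
  = \kappa\, T_{\mu\nu}

% Palatini formalism: the connection is varied independently of the metric
% and is fixed by the condition
\nabla_\lambda \!\left( \sqrt{-g}\, f'(R)\, g^{\mu\nu} \right) = 0
```

The thesis's emphasis on an independent connection points to the Palatini-type formulation, whose field equations coincide with the metric case only when f''(R) = 0, i.e. for general relativity itself.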


Societal reactions to the norm-breaking behavior of children reveal how we understand childhood, the relations between generations, and the limits of a community's tolerance. In Finland, children who repeatedly commit crimes receive social service measures based on the Child Welfare Act. Until the 1980s the city of Helsinki ("Stadi" in Helsinki slang) had an agency specifically established for ill-behaving children, after which a unified agency for maltreated and maladjusted children was founded. Through five boys' welfare cases, this research aims to define what kinds of positions, social relations and structures are constructed in the social dynamics of these children's everyday lives. The cases cover different decades from the 1940s to the present. At the same time the cases reflect child welfare and societal practices, and reveal how communities have participated in constructing deviance in different eras. The research is meta-theoretically based on critical realism, and specifically on Roy Bhaskar's transformational model of social activity. The cases are analyzed in the framework of Edwin M. Lemert's societal reaction theory. Thus the focus of the study is on the wide structural context of the institutional and societal definitions of deviance. The research is methodologically based on a qualitative multiple-case study design. The primary data consist of classified child welfare case files collected from the archives of the city of Helsinki. The data at the institutional level consist of the annual reports from 1943 to 2004, the ordinances from 1907 onwards, and various committee documents produced in the law-making processes of twentieth-century child welfare, youth and criminal legislation. Empirical findings are interpreted in a dialogue with previous historical and child welfare research, contemporary literature, and studies of urban development. The analysis is based on Derek Layder's model of adaptive theory.
The research forms a viewpoint on the historical study of child welfare, in which the historical era, its agents and the dynamics of their mutual relations are studied through an individual-level reconstruction based on societal reaction theory. The case analyses reveal how the children's positions formed differently in the different eras of child welfare practice. In the 1940s the child is positioned as a psychopath and a criminal type. The measures are aimed at protecting the community from the disturbed child, and at adjusting the individual through isolation. From the 1960s to the 1980s the child is positioned as a child in need of help and support. The child becomes a victim, a subject with rights, and a target of protection. At the turn of the millennium a norm-breaking child is positioned as a dangerous individual who, in the name of community safety, has to be confined. The case analyses also reveal the prevailing academic and practical paradigms of each era. Keywords: childhood, youth, child protection, child welfare, delinquency, crime, deviance, history, critical realism, case study research


Smoking has decreased significantly over the last few decades, but it remains one of the most serious public health problems in all Western countries. Smoking has decreased especially in upper socioeconomic groups, and this differentiation is an important factor behind socioeconomic health differentials. The study examines smokers' risk perceptions and justifications and the meaning of smoking in different occupational groups. The starting point of the research is that the concept of health behaviour, and the individualistic orientation it implies, offers too narrow a viewpoint with which to understand the current cultural status of smoking and to explain its association with social class. The study utilizes two kinds of data. Internet discussions are used to examine smokers' risk perceptions and counter-reactions to current public health discourses. Interviews with smokers and ex-smokers (N=55) from different occupations are used to analyse the process of giving up smoking, social class differences in the justifications of smoking, and the role of smoking in manual work. The continuing popularity of smoking is not a question of lacking knowledge of, or concern about, health risks. Even manual workers, among whom smoking is more prevalent, consider smoking a health risk. However, smokers have several ways of dealing with the risk. They can equate it with other health risks confronted in everyday life or question the adequacy of expert knowledge. Smoking can be seen as signifying the ability to make independent decisions and to question authorities. Regardless of the self-acknowledged dependency, smoking can be understood as a choice. This seemingly contradictory viewpoint was central especially for non-manual workers. They emphasized the pleasures and rules of smoking and the management of dependency. In contrast, manual workers did not give positive justifications for their smoking, implying the self-evident nature of the habit.
Still, smoking functions as a resource in manual work, as it increases the autonomy of workers in their daily tasks. At the same time, smoking is attached to other routines and practices at workplaces. The study shows that in order to understand current trends in smoking, differing perceptions of risk and health, as well as ways of life and their social and economic determinants, need to be taken into account. Focusing on the social contexts and environments in which smoking is most prevalent is necessary in order to explain the current association of smoking with the working class.


The aim of this work was to examine how breathing, swallowing and voicing are affected in different laryngeal disorders. For this purpose, we examined four patient groups: patients who had undergone total laryngectomy, anterior cervical decompression (ACD) or injection laryngoplasty with autologous fascia (ILAF), and patients with dyspnea during exercise. We studied the problems and benefits related to the automatic speech valve used for the rehabilitation of speech in laryngectomized patients. The device was given to 14 totally laryngectomized patients who used the traditional valve especially well. The usefulness of voice and the intelligibility of speech were assessed by speech pathologists. The results demonstrated better performance with the traditional valve on both dimensions. Most of the patients considered the automatic valve a helpful additional device, but because of heavier breathing and the greater effort needed for speech production, it was not suitable as a sole device in speech rehabilitation. Dysphonia and dysphagia are known complications of ACD. These symptoms are caused by the stretching of tissue during surgery, but their extent, and recovery from them, were not well known before our study. We studied two patient groups: an early group of 50 patients who were examined immediately before and after surgery, and a late group of 64 patients who were examined 3-9 months postoperatively. Altogether, 60% reported dysphonia and 69% dysphagia immediately after the operation. Even though dysphagia and dysphonia often appeared after surgery, permanent problems seldom occurred. Six cases (12%) of transient and two cases (3%) of permanent vocal cord paresis were detected. In our third study, the long-term results of ILAF in 43 patients with unilateral vocal cord paralysis were examined. The mean follow-up was 5.8 years (range 3-10).
Perceptual evaluation demonstrated improved voice quality, and videostroboscopy revealed complete or partial glottal closure in 83% of the patients. Fascia proved to be a stable injection material with good vocal results. In our final study we developed a new diagnostic method for exertional laryngeal dyspnea by combining a cardiovascular exercise test with simultaneous fiberoptic observation of the larynx. With this method it is possible to visualize paradoxical closure of the vocal cords during inspiration, which is a diagnostic criterion for vocal cord dysfunction (VCD). We examined 30 patients referred to our hospital on suspicion of exercise-induced vocal cord dysfunction (EIVCD). Twenty-seven of the thirty patients were able to perform the test. Dyspnea was induced in 15 patients; of these, five had EIVCD and four a high suspicion of EIVCD. With our test it is possible to establish an accurate diagnosis of exertional laryngeal dyspnea. Moreover, the unnecessary use of asthma drugs often seen among these patients can be avoided.


Three strategically important uses of IT in the construction industry are the storage and management of project documents on web servers (EDM), the electronic handling of orders and invoices between companies (EDI), and the use of 3-D models including non-geometrical attributes for integrated design and construction (BIM). In a broad longitudinal survey study of IT use in the Swedish construction industry, the extent of use of these techniques was measured in 1998, 2000 and 2007. The results showed that EDM and EDI are already well-established techniques, whereas BIM, although it promises the biggest potential benefits to the industry, only seems to be at the beginning of adoption. In a follow-up to the quantitative studies, the factors affecting the decisions to implement EDM, EDI and BIM, as well as the actual adoption processes, were studied using semi-structured interviews with practitioners. The interview studies drew on theoretical frameworks from IT-adoption research, with the UTAUT model in particular providing the main basis for the analyses presented here. The results showed that the decisions to take the above technologies into use are made on three different levels: the individual level, the organisational level in the form of a company, and the organisational level in the form of a project. The different patterns in adoption can in part be explained by where the decisions are mainly taken: EDM is driven from the organisation/project level, EDI mainly from the organisation/company level, and BIM by individuals pioneering the technique.