993 results for Analyse configurationnelle
Abstract:
The comments I make are based on my nearly twenty years' involvement in the dementia cause at both a national and international level. In preparation, I read two papers, namely the Ministerial Dementia Forum – Option Paper produced by KPMG Management Consultants (2014) and Analysis of Dementia Programmes and Services Funded by the Department of Social Services: Conversation Starter, prepared by KPMG as a preparation document for those attending a workshop in Brisbane on April 22nd, 2015. Dementia is a complex "syndrome" and, as is often said, "when you meet one person with dementia, you have met one", meaning that no two people with dementia are the same. Even in dementia care, Australia is a "lucky country", and there is much to be said for the quality and diversity of dementia care available to people living with dementia. Despite this, I agree with the many views expressed in the material I read that there is scope for improvement, especially in the way that services are coordinated. In saying that, I do not purport to have all the solutions, nor do I claim to have the knowledge required to comment on all the programs covered by this review. If I appear to be a "biased" advocate for Alzheimer's Australia across the States and Territories, it is because I have seen constant evidence of ordinary people doing extraordinary things with inadequate resources. Dementia care is not cheap, and if those funding dementia services are interested primarily in economic outcomes and benefits, the real purpose of this consultation will be defeated. In addition, nowhere in the material I have read is there any recognition that in many instances program funding is a complex mix of government (at all levels) and private funding. This makes reviewing those programs more complex and less able to be coordinated at a Departmental level. It goes without saying, therefore, that the Federal Government is not "the only player in this game". Of all those participating in this review, Alzheimer's Australia is best placed to comment on programs, as it is the most connected to people living with dementia and has probably the best record of consulting with them. It would appear, however, that its role has been reduced to that of a "bit player". Without wanting to be critical, of the 70 individuals and organisations whose comments are reported in the Forum Report, only three (3), or 4.28%, were actual carers of people living with dementia. Even if it is argued that a number of the organisations present represented consumers, the percentage rises only marginally to 8.57%, which is hardly an endorsement of the forum being "consumer driven". The majority of those present were service providers, each with their own agenda and each seeking advantage for their "business". The final point I want to make, before commenting on more specific, program-related issues, is that many of the programs being reviewed have a much longer history than is reflected in the material I have read. Their growth and development were pioneered by Alzheimer's Australia organisations across the country, often with no government funding. Attempts to bring about better coordination of programs were often at the behest of Alzheimer's Australia but in the main were ignored. The opportunity to now put this right is long overdue.
Abstract:
INTRODUCTION The dimensions of the thoracic intervertebral foramen in adolescent idiopathic scoliosis (AIS) have not previously been quantified. During posterior-approach scoliosis correction surgery, pedicle screws may occasionally breach into the foramen. A better understanding of the dimensions of the foramen may be useful in surgical planning. This study describes a reproducible method for measurement of the thoracic foramen in AIS using computerized tomography (CT). METHODS In 23 pre-operative female patients with Lenke 1 type AIS and right-side convexity major curves confined to the thoracic spine, the foraminal height (FH), foraminal width (FW), pedicle to superior articular process distance (P-SAP) and cross-sectional foraminal area (FA) were measured using multiplanar reconstructed CT. Measurements were made at the entrance, midpoint and exit of the thoracic foramina from T1/T2 to T11/T12. Results were correlated with the potential dependent variables of major curve Cobb angle measured on X-ray and CT, age, weight, Lenke classification subtype, Risser grade and number of spinal levels in the major curve. RESULTS The FH, FW, P-SAP and FA dimensions and ratios are all significantly larger on the convexity of the major curve and maximal at or close to the apex. Mean thoracic foraminal dimensions change in a predictable manner relative to position on the major thoracic curve. There was no significant correlation between the measured foraminal dimensions or ratios and the potential dependent variables. The average ratios of convexity to concavity dimensions at the apex foramina for entrance, midpoint and exit, respectively, are FH (1.50, 1.38, 1.25), FW (1.28, 1.30, 0.98), FA (2.06, 1.84, 1.32) and P-SAP (1.61, 1.47, 1.30). CONCLUSION Foraminal dimensions of the thoracic spine are significantly affected by AIS. Foraminal dimensions have a predictable convexity to concavity ratio relative to the proximity to the major curve apex. Surgeons should be aware of these anatomical differences during scoliosis correction surgery.
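The convexity-to-concavity ratios reported above reduce to simple per-level arithmetic on paired measurements. The following is a minimal sketch of how such ratios might be computed and summarised; the level names, variable names and numeric values are hypothetical placeholders, not data from the study.

```python
# Hypothetical sketch: convexity-to-concavity ratios of foraminal dimensions.
# All measurement values below are placeholders, not data from the study.

measurements = {
    # level: {dimension: (convex_side, concave_side)}, dimensions in mm (FA in mm^2)
    "T8/T9 (apex)": {"FH": (9.0, 6.0), "FW": (5.1, 4.0), "FA": (41.2, 20.0), "P-SAP": (8.0, 5.0)},
    "T9/T10":       {"FH": (8.4, 6.3), "FW": (4.8, 4.1), "FA": (37.0, 24.1), "P-SAP": (7.2, 5.3)},
}

for level, dims in measurements.items():
    ratios = {name: convex / concave for name, (convex, concave) in dims.items()}
    formatted = ", ".join(f"{name} {ratio:.2f}" for name, ratio in ratios.items())
    print(f"{level}: {formatted}")
```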
Abstract:
This dissertation deals with the terminology of the Euro currency. Its aims are to determine the characteristics of the designations in a certain LSP and to discover whether the recommendations and rules that have been given for the formation of designations and 'ideal' designations have any influence on the usage of the designations. The characteristics analysed include length of the designation, part of speech, form, formation method, constancy, monosemy, suitability to a concept system and degree of specialty. The study analyses the actual usage of the designations in texts and the implementation of the designations. The study is an adaptation of a terminometric survey and uses concept analysis and quantitative analysis as its basic methods. The frequency of each characteristic is measured statistically. It is assumed that the 'ideality' of a designation influences its usage, for example that if a designation is short, it is used more than its longer rivals (synonyms). The designations are analysed in a corpus consisting of a compilation of different texts concerning the Euro. The corpus is divided according to three features: year (1998-2003), genre (judicial texts, annual reports and brochures) and language (Finnish and German). Each analysis is performed according to each of these features and compared with the others. The results indicate that some of the characteristics of the designations indeed seem to have an influence on the usage of the designations. For example, in the analysed Finnish material, monosemy and suitability to the concept system often lead to the implementation of the designation that has the 'ideal' or a certain value in these characteristics. In the German material, an 'ideal' value in the characteristics leads to the implementation of the designations more often than in the Finnish material. The contrastive study indicates that, for example, suitability to a concept system leads to the implementation of a designation in judicial texts more often than in other genres. The recommendations given for an 'ideal' designation are thus often acceptable, but they cannot be generalized to all languages to the same extent.
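As an illustration of the kind of terminometric frequency count described above, the sketch below tallies competing synonymous designations in a small text sample and relates usage to designation length. The sample sentences and the synonym set are invented for illustration only and are not taken from the dissertation or its corpus.

```python
import re
from collections import Counter

# Hypothetical synonym set competing for the same concept (illustrative only).
synonym_sets = {
    "euro area": ["euro area", "euro zone", "euro currency area"],
}

# Invented text sample standing in for a corpus slice.
corpus = (
    "The euro area grew in 2003. Analysts expect the euro zone to expand, "
    "although the euro currency area still faces risks. The euro area remains stable."
).lower()

for concept, designations in synonym_sets.items():
    counts = Counter({d: len(re.findall(re.escape(d), corpus)) for d in designations})
    for designation, count in counts.most_common():
        words = len(designation.split())
        print(f"{concept}: '{designation}' ({words} words) occurs {count}x")
```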
Abstract:
Valency Realization in Short Excerpts of News Text: A Pragmatically Founded Analysis. This dissertation is a study of so-called pragmatic valency. The aim of the study is to examine the phenomenon both theoretically, by discussing the research literature, and empirically, based on evidence from a text corpus consisting of 218 short excerpts of news text from the German newspaper Frankfurter Allgemeine Zeitung. In the theoretical part of the study, the central concepts of valency and pragmatic valency are discussed. In the research literature, valency denotes the relation between the verb and its obligatory and optional complements. Pragmatic valency can be defined as the modification of the so-called system valency in the parole, including non-realization of an obligatory complement, non-realization of an optional complement and realization of an optional complement. Furthermore, the investigation of pragmatic valency includes the role of adjuncts, elements that are not defined by the valency, in the concrete valency realization. The corpus study investigates the valency behaviour of German verbs in a corpus of about 1500 sentences, combining the methodology and concepts of valency theory, semantics and text linguistics. The analysis focuses on the roughly 600 sentences which show deviations from the system valency, providing over 800 examples of the modification of the system valency as codified in the (valency) dictionaries. The study attempts to answer the following primary question: why is the system valency modified in the parole? To answer the question, the concept of modification types is introduced. The modification types are recognized using distinctive feature bundles, in which each feature with a negative or a positive value refers to one reason for the modification treated in the research literature; for example, the features of irrelevance and relevance, focus, world and text-type knowledge, text theme, theme-rheme structure and cohesive chains are applied. The valency approach appears in a new light when explored through corpus-based investigation; both the optionality of complements and the distinction between complements and adjuncts as defined in the present valency approach seem in some respects defective. Furthermore, the analysis indicates that the adjuncts outside the valency domain play a central role in the concrete realization of the valency. Finally, the study suggests a definition of pragmatic valency, based on the modification types introduced in the study and tested in the corpus analysis.
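The feature-bundle idea can be illustrated with a small data-structure sketch: each modification type is a bundle of positively or negatively valued features, and an observed valency realization is classified by matching its features against the bundles. The feature names and bundles below are invented for illustration; they are not the dissertation's actual inventory.

```python
# Hypothetical sketch of distinctive feature bundles for modification types.
# Feature names and bundles are illustrative, not the study's actual inventory.

modification_types = {
    "omission_of_irrelevant_complement":  {"relevant": False, "in_cohesive_chain": False},
    "omission_of_recoverable_complement": {"relevant": False, "in_cohesive_chain": True},
    "realization_of_optional_complement": {"relevant": True,  "in_cohesive_chain": False},
}

def classify(observation: dict) -> str:
    """Return the first modification type whose feature bundle matches the observation."""
    for mod_type, bundle in modification_types.items():
        if all(observation.get(feature) == value for feature, value in bundle.items()):
            return mod_type
    return "unclassified"

example = {"relevant": False, "in_cohesive_chain": True}
print(classify(example))  # -> omission_of_recoverable_complement
```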
Abstract:
Human activities extract and displace different substances and materials from the earth's crust, thus causing various environmental problems, such as climate change, acidification and eutrophication. As problems have become more complicated, more holistic measures that consider the origins and sources of pollutants have been called for. Industrial ecology is a field of science that forms a comprehensive framework for studying the interactions between the modern technological society and the environment. Industrial ecology considers humans and their technologies to be part of the natural environment, not separate from it. Industrial operations form natural systems that must also function as such within the constraints set by the biosphere. Industrial symbiosis (IS) is a central concept of industrial ecology. Industrial symbiosis studies look at the physical flows of materials and energy in local industrial systems. In an ideal IS, waste material and energy are exchanged by the actors of the system, thereby reducing the consumption of virgin material and energy inputs and the generation of waste and emissions. Companies are seen as part of chains of suppliers and consumers that resemble those of natural ecosystems. The aim of this study was to analyse the environmental performance of an industrial symbiosis based on pulp and paper production, taking life cycle impacts into account as well. Life Cycle Assessment (LCA) is a tool for quantitatively and systematically evaluating the environmental aspects of a product, technology or service throughout its whole life cycle. Moreover, the Natural Step Sustainability Principles formed a conceptual framework for assessing the environmental performance of the case study symbiosis (Paper I). The environmental performance of the case study symbiosis was compared to four counterfactual reference scenarios in which the actors of the symbiosis operated on their own. The research methods used were process-based life cycle assessment (LCA) (Papers II and III) and hybrid LCA, which combines process and input-output LCA (Paper IV). The results showed that the environmental impacts caused by the extraction and processing of the materials and energy used by the symbiosis were considerable. If only the direct emissions and resource use of the symbiosis had been considered, less than half of the total environmental impacts of the system would have been taken into account. When the results were compared with the counterfactual reference scenarios, the net environmental impacts of the symbiosis were smaller than those of the reference scenarios. The reduction in environmental impacts was mainly due to changes in the way energy was produced. However, the results are sensitive to the way the reference scenarios are defined. LCA is a useful tool for assessing the overall environmental performance of industrial symbioses. It is recommended that the upstream impacts, in addition to the direct effects, be taken into account when assessing the environmental performance of industrial symbioses. Industrial symbiosis should be seen as part of the process of improving the environmental performance of a system. In some cases, it may be more efficient, from an environmental point of view, to focus on supply chain management instead.
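A minimal sketch of the scenario-comparison logic described above: each impact-category total for the symbiosis is set against the corresponding total for a counterfactual scenario in which the actors operate separately, and the relative difference indicates the net benefit or burden. The category names and figures are placeholders, not results from the study.

```python
# Hypothetical sketch: comparing life-cycle impacts of an industrial symbiosis
# with a counterfactual scenario of stand-alone actors. Figures are placeholders.

symbiosis_impacts = {"climate change (t CO2-eq)": 900.0, "acidification (t SO2-eq)": 4.5}
reference_impacts = {"climate change (t CO2-eq)": 1200.0, "acidification (t SO2-eq)": 6.0}

for category, symbiosis_value in symbiosis_impacts.items():
    reference_value = reference_impacts[category]
    change_pct = 100.0 * (symbiosis_value - reference_value) / reference_value
    print(f"{category}: symbiosis {symbiosis_value}, reference {reference_value}, "
          f"change {change_pct:+.1f}%")
```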
Abstract:
Equilibrium thermodynamic analysis has been applied to the low-pressure MOCVD process using manganese acetylacetonate as the precursor. "CVD phase stability diagrams" have been constructed separately for the processes carried out in argon and oxygen ambient, depicting the compositions of the resulting films as functions of CVD parameters. For the process conducted in argon ambient, the analysis predicts the simultaneous deposition of MnO and elemental carbon in 1:3 molar proportion over a range of temperatures. The analysis also predicts that, if CVD is carried out in oxygen ambient, even a very low flow of oxygen leads to the complete absence of carbon in the deposited film, with greater oxygen flow resulting in the simultaneous deposition of two different manganese oxides under certain conditions. The results of thermodynamic modeling have been verified quantitatively for low-pressure CVD conducted in argon ambient. Indeed, the large excess of carbon in the deposit is found to constitute a MnO/C nanocomposite, the associated cauliflower-like morphology making it a promising candidate for electrode material in supercapacitors. CVD carried out in oxygen flow, under specific conditions, leads to the deposition of more than one manganese oxide, as expected from thermodynamic analysis (forming an oxide-oxide nanocomposite). These results together demonstrate that thermodynamic analysis of the MOCVD process can be employed to synthesize thin films in a predictive manner, thus avoiding the inefficient trial-and-error approach usually associated with MOCVD process development. The prospect of developing thin films of novel compositions and characteristics in a predictive manner, through the appropriate choice of CVD precursors and process conditions, emerges from the present work.
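The predictive use of equilibrium analysis can be sketched very crudely: at each point of a temperature grid, the candidate deposit assemblage with the lowest total Gibbs energy is taken as the stable one, and such points assembled together form a "phase stability diagram". The sketch below is only an illustration of that selection logic; the candidate assemblages and the linear dG(T) expressions are placeholders, not thermochemical data for the Mn-O-C system or the method used in the paper.

```python
# Hypothetical, heavily simplified sketch of building a phase-stability map:
# at each temperature, pick the candidate deposit assemblage with the lowest
# total Gibbs energy. The dG(T) expressions are placeholders, not real data.

def dg_candidates(temperature_k: float) -> dict:
    """Placeholder Gibbs energies (kJ/mol) of candidate deposit assemblages."""
    return {
        "MnO + 3C":      -350.0 + 0.040 * temperature_k,
        "MnO (pure)":    -330.0 + 0.055 * temperature_k,
        "Mn3O4 + Mn2O3": -340.0 + 0.050 * temperature_k,
    }

for temperature in range(600, 1101, 100):
    candidates = dg_candidates(float(temperature))
    stable = min(candidates, key=candidates.get)
    print(f"T = {temperature} K: predicted deposit = {stable} "
          f"(dG = {candidates[stable]:.1f} kJ/mol)")
```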
Abstract:
NMR spectra of molecules oriented in a liquid-crystalline matrix provide information on the structure and orientation of the molecules. Thermotropic liquid crystals used as an orienting medium result in spectra of spins that are generally strongly coupled. The number of allowed transitions increases rapidly with the number of interacting spins. Furthermore, the number of single quantum transitions required for analysis is highly redundant. In the present study, we have demonstrated that it is possible to separate the subspectra of a homonuclear dipolar-coupled spin system on the basis of the spin states of the coupled heteronuclei by multiple quantum (MQ)-single quantum (SQ) correlation experiments. This significantly reduces the number of redundant transitions, thereby simplifying the analysis of the complex spectrum. The methodology has been demonstrated on doubly 13C-labeled acetonitrile aligned in the liquid-crystal matrix and has been applied to analyze the complex spectrum of an oriented six-spin system.
Abstract:
A new method is presented here to analyse the Peierls-Nabarro model of an edge dislocation in a rectangular plate. The analysis is based on the superposition scheme and series expansions of complex potentials. The stress field and the dislocation density field on the slip plane can be expressed as series of Chebyshev polynomials of the first and second kind, respectively. Two sets of governing equations are obtained, on the slip plane and on the outer boundary of the rectangular plate, respectively. Three numerical methods are used to solve the governing equations.
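As a hedged illustration of the series representation mentioned above, the sketch below evaluates a truncated series of Chebyshev polynomials of the first kind (standing in for the stress field) and of the second kind (standing in for the dislocation density field) on the slip plane, using scipy's eval_chebyt and eval_chebyu. The coefficients are arbitrary placeholders, not solutions of the paper's governing equations.

```python
import numpy as np
from scipy.special import eval_chebyt, eval_chebyu  # Chebyshev polynomials T_n, U_n

# Hypothetical truncated series on the slip plane, x in [-1, 1].
# Coefficients are arbitrary placeholders, not solutions of the governing equations.
stress_coeffs = np.array([1.0, -0.3, 0.1, 0.05])    # series in T_n (first kind)
density_coeffs = np.array([0.8, 0.2, -0.05, 0.01])  # series in U_n (second kind)

x = np.linspace(-1.0, 1.0, 5)
stress = sum(c * eval_chebyt(n, x) for n, c in enumerate(stress_coeffs))
density = sum(c * eval_chebyu(n, x) for n, c in enumerate(density_coeffs))

for xi, s, d in zip(x, stress, density):
    print(f"x = {xi:+.2f}: stress ~ {s:+.3f}, dislocation density ~ {d:+.3f}")
```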
Abstract:
Trials to determine the magnitude of bycatch reduction achieved by sorting grids used in the commercial brown shrimp fishery were carried out from September to December 1997. Trawls with 9 m beam length were used on different fishing grounds in the estuary of the Elbe River near Cuxhaven. The sorting grids tested were made of stainless steel bars spaced at 18, 20, 22, 26 and 30 mm, built into a cylindrical stainless steel frame with a diameter of 65 cm at an angle of attack of 45 degrees. This frame was positioned between the forenet and the codend. Simultaneous hauls were made with a trawl of equal construction but without a sorting grid, and the weighed catch components (fish, discard shrimps and commercial-size shrimps), separated by means of a riddle, were compared. The composition of the sorted-out part of the catch of the sorting grid net could be calculated by comparing the corresponding catch components in the standard trawl and the sorting grid trawl. According to this, the total catch of the beam trawl with the sorting grid is reduced by 18 to 38 % depending on the space between the bars. Fish account for 7 to 31 % of the sorted-out part of the catch. The use of the sorting grid, however, also leads to losses of 4 to 12 % of commercial-size shrimps in October. Per hour of towing this means a loss of 10.3 % of commercial-size shrimps with a sorting grid of 18 mm space between the bars and of 12.4 % for a 26 mm grid.
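The comparison of paired hauls described above reduces to simple percentage arithmetic. The sketch below shows one possible way to compute the total catch reduction, the fish share of the sorted-out catch and the loss of commercial-size shrimp from paired catch weights; the weights are invented placeholders, not the trial data.

```python
# Hypothetical sketch: estimating bycatch reduction from paired hauls
# (standard trawl vs. trawl with sorting grid). Weights (kg) are placeholders.

standard_haul = {"fish": 12.0, "discard_shrimp": 30.0, "commercial_shrimp": 58.0}
grid_haul     = {"fish": 8.0,  "discard_shrimp": 22.0, "commercial_shrimp": 52.0}

total_standard = sum(standard_haul.values())
total_grid = sum(grid_haul.values())
sorted_out = {k: standard_haul[k] - grid_haul[k] for k in standard_haul}

reduction_pct = 100.0 * (total_standard - total_grid) / total_standard
fish_share_pct = 100.0 * sorted_out["fish"] / sum(sorted_out.values())
shrimp_loss_pct = 100.0 * sorted_out["commercial_shrimp"] / standard_haul["commercial_shrimp"]

print(f"total catch reduced by {reduction_pct:.1f}%")
print(f"fish share of sorted-out catch: {fish_share_pct:.1f}%")
print(f"loss of commercial-size shrimp: {shrimp_loss_pct:.1f}%")
```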
Abstract:
Trials to determine the magnitude of bycatch reduction achieved by the sieve nets used in the commercial brown shrimp fishery were carried out from September to December 1997. Trawls with 9 m beam length were the subject of the investigation. They were used on different fishing grounds in the estuary of the Elbe near Cuxhaven. Sieve nets with 50, 60 and 80 mm mesh opening were tested in 29 hauls with a total duration of 31.6 h. A trawl of equal construction but without a sieve net, fished synchronously, was used for comparison. The proportional catch composition in the codend was determined by weighing the catch components (fish, discard shrimps and commercial-size shrimps) as separated by means of a riddle. The composition of the sorted-out part of the catch could be calculated by comparison of the corresponding catch components in both the standard trawl and the sieve net trawl. According to this, the total catch of a beam trawl with a sieve net is diminished by 9 to 34 % depending on the mesh opening of the sieve net. Fish account for 32 to 58 % of the sorted-out part of the catch. Use of a sieve net, however, also leads to a loss of 6 to 15 % of commercial-size shrimps. Per hour of towing this means a loss of 8.7 kg of commercial-size shrimps with a sieve net of 60 mm mesh opening and of 1.8 kg for a mesh size of 80 mm.
Abstract:
Raw or smoked eel was analysed by isoelectric focusing of sarcoplasmic proteins. For raw fish, specific protein patterns were obtained for A. anguilla/A. rostrata, A. japonica and A. australis, but in the case of smoked fish, differentiation was only possible between Atlantic and Pacific species. Differentiation of raw or smoked eel was also possible by PCR-SSCP, but the patterns of ssDNA showed some intra-specific variability depending on the type of amplicon.
Abstract:
Deutscher Caviar, made from the roe of lumpfish or capelin, gives species-specific patterns in protein electrophoresis. The same techniques can be used to differentiate caviar from salmon and trout. The differentiation of sturgeon caviar (beluga, osietra, sevruga) is possible by isoelectric focusing, but not by SDS-PAGE. PCR-based methods of DNA analysis for identifying the origin of sturgeon caviar are under development.