993 results for Analyse MMG
Abstract:
This thesis consists of three articles in the economics of non-renewable natural resources. We consider in turn the following questions: the in-situ price of non-renewable natural resources; the optimal extraction rate; and the price of non-renewable, durable resources. In the first article, we estimate the in-situ price of non-renewable natural resources, using data on average extraction cost as an approximation of marginal cost. Using the Generalized Method of Moments, a market-price dynamic derived from the optimality conditions of the Hotelling model is estimated on panel data for 14 non-renewable natural resources. Our results tend to support the model. First, the Hotelling model has good explanatory power for the observed market price. Second, although the estimated price exhibits a structural change over time, this appears to have no significant impact on the explanatory power of the model. Third, we cannot reject the hypothesis that the marginal extraction cost can be approximated by data on average cost. Fourth, the in-situ price estimated while accounting for structural change either decreases or follows an inverted-U shape over time and appears to be positively correlated with the market price. Fifth, for nine of the fourteen resources, the difference between the in-situ price estimated with structural change and the one estimated while ignoring it is a zero-mean process. In the second article, we test for the existence of an equilibrium in which the optimal extraction rate of non-renewable resources is linear in the stock of resource remaining in the ground. We first consider a Hotelling model with a time-varying demand function characterized by a constant price elasticity and a time-varying extraction cost function characterized by constant elasticities with respect to the extraction rate and the resource stock. We then show that an equilibrium in which the optimal extraction rate is proportional to the resource stock exists if and only if the discount rate and the parameters of the demand and extraction cost functions satisfy a precise relationship. Finally, we use panel data on fourteen non-renewable resources to test this relationship empirically. When the model parameters are assumed to be time-invariant, we find that the relationship can be rejected for only six of the fourteen resources. However, this result changes once structural change in resource prices over time is taken into account: in that case the relationship is rejected for all fourteen resources. In the third article, we study the evolution of the price of a non-renewable natural resource when that resource is durable, that is, once extracted it becomes a productive asset held above ground. We borrow from asset-pricing theory to do so. The portfolio choice then bears on the following assets: a stock of the non-renewable resource held in the ground, which provides no productive services; a stock of the resource held above ground, which provides a flow of productive services; and a stock of a composite good, which can be held either as productive capital or as a bond with a given rate of return. The productivities of the composite-good production sector and of the resource-extraction sector evolve stochastically. We show that the prediction one can draw about the resource price path differs considerably from the one implied by the elementary Hotelling rule, and that no unambiguous prediction about the behaviour of the price path can be obtained analytically.
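For orientation only, the elementary Hotelling rule that the first article builds on can be stated as follows. This is the textbook continuous-time form under a constant discount rate, not the thesis's actual GMM estimating equation, whose moment conditions and structural-change terms are not reproduced here.

```latex
% Elementary Hotelling rule (textbook continuous-time form).
% \lambda_t : in-situ price (scarcity rent) of the resource
% p_t       : observed market price
% C'(q_t)   : marginal extraction cost (proxied by average cost in the article)
% r         : discount rate
\[
  \lambda_t = p_t - C'(q_t),
  \qquad
  \frac{\dot{\lambda}_t}{\lambda_t} = r
  \;\Longrightarrow\;
  \lambda_t = \lambda_0 \, e^{rt}.
\]
```

The article estimates a market-price dynamic derived from these optimality conditions on panel data, which is why an in-situ price series can be recovered once marginal cost is proxied by average cost.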
Abstract:
"This is an important book that ought to launch a debate about how we research our understanding of the world; it is an innovative intervention in a vital public issue, and it is an elegant and scholarly hard look at what is actually happening." (Jean Seaton, Professor of Media History, University of Westminster, UK, and Official Historian of the BBC.) Summary: This book investigates how comparative studies of international TV news (here: the presentation of violence) can best be conceptualized in a way that allows cross-national, comparative conclusions on an empirically validated basis. It shows that such a conceptualization is necessary in order to overcome existing restrictions on the comparability of international analyses of violence presentation. The examples investigated are the most watched news bulletins in Great Britain (the 10 O'Clock News on the BBC), Germany (Tagesschau on ARD) and Russia (Vremja on Channel 1). The book highlights a substantial cross-national violence news flow, as well as a cross-national visual violence flow (key visuals), as distinct transnational components. In addition, event-related textual analysis reveals how the historical rootedness of nations and their symbols of power are still manifested in televisual mediations of violence. In conclusion, the study argues for a conscientious use of comparative data and analysis, both in journalism research and in practice, in order to understand what such material may convey in the different arenas of today's newsmaking.
Abstract:
This paper deals with the analysis of the parameters that are effective in shaft voltage generation in induction generators. It examines the different parasitic capacitive couplings through mathematical equations, finite element simulations and experiments. The effects of different design parameters on the proposed capacitances and the resultant shaft voltage have been studied. Several parameters can change the proposed capacitive couplings, such as the stator slot tooth, the gap between the slot tooth and the winding, and the height of the slot tooth, as well as the air gap between the rotor and the stator. This analysis can be used at an early stage of generator design to reduce shaft voltage and avoid the additional costs of mitigating the resultant bearing currents.
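A common way to make such capacitive couplings concrete is the voltage-divider model used in the bearing-current literature, in which the shaft voltage is a fraction of the machine's common-mode voltage set by the winding-to-rotor, rotor-to-frame and bearing capacitances. The sketch below is a generic illustration of that model, not the paper's own equations; the capacitance and voltage values are placeholders.

```python
# Generic capacitive voltage-divider model for shaft voltage (illustrative only;
# capacitance and voltage values are placeholders, not taken from the paper).

def bearing_voltage_ratio(c_wr, c_rf, c_b_de, c_b_nde):
    """Shaft-to-frame voltage as a fraction of the common-mode voltage.

    c_wr    -- stator-winding-to-rotor capacitance [F]
    c_rf    -- rotor-to-frame (air-gap) capacitance [F]
    c_b_de  -- drive-end bearing capacitance [F]
    c_b_nde -- non-drive-end bearing capacitance [F]
    """
    return c_wr / (c_wr + c_rf + c_b_de + c_b_nde)

pF = 1e-12
bvr = bearing_voltage_ratio(100 * pF, 2000 * pF, 200 * pF, 200 * pF)
v_shaft = bvr * 300.0  # e.g. a 300 V common-mode excursion
print(f"BVR = {bvr:.3f}, shaft voltage ~ {v_shaft:.1f} V")
```

Design changes of the kind listed in the abstract (slot-tooth geometry, the winding gap, the air gap) act on this model by shifting the individual capacitances and hence the ratio.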
Abstract:
Monitoring Internet traffic is critical in order to acquire a good understanding of threats to computer and network security and in designing efficient computer security systems. Researchers and network administrators have applied several approaches to monitoring traffic for malicious content. These techniques include monitoring network components, aggregating IDS alerts, and monitoring unused IP address spaces. Another method for monitoring and analyzing malicious traffic, which has been widely tried and accepted, is the use of honeypots. Honeypots are very valuable security resources for gathering artefacts associated with a variety of Internet attack activities. As honeypots run no production services, any contact with them is considered potentially malicious or suspicious by definition. This unique characteristic of the honeypot reduces the amount of collected traffic and makes it a more valuable source of information than other existing techniques. Currently, there is insufficient research in the field of honeypot data analysis. To date, most of the work on honeypots has been devoted to designing new honeypots or optimizing existing ones. Approaches for analyzing data collected from honeypots, especially low-interaction honeypots, are presently immature, and analysis techniques are manual and focus mainly on identifying existing attacks. This research addresses the need to develop more advanced techniques for analyzing Internet traffic data collected from low-interaction honeypots. We believe that characterizing honeypot traffic will improve the security of networks and, if the honeypot data is handled in time, give early signs of new vulnerabilities or outbreaks of new automated malicious code, such as worms. The outcomes of this research include: identification of the repeated use of attack tools and attack processes by grouping activities that exhibit similar packet inter-arrival time distributions using the cliquing algorithm; application of principal component analysis to detect the structure of attackers' activities present in low-interaction honeypots and to visualize attackers' behaviors; detection of new attacks in low-interaction honeypot traffic through the use of the principal components' residual space and the squared prediction error statistic; real-time detection of new attacks using recursive principal component analysis; and a proof-of-concept implementation for honeypot traffic analysis and real-time monitoring.
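As a rough illustration of the residual-space idea mentioned in the outcomes, the sketch below builds a principal subspace from baseline traffic features and flags new observations whose squared prediction error (SPE) exceeds a threshold. It is not the thesis's implementation: the feature matrix, the number of retained components and the percentile threshold are all assumptions, and the original work may use the analytical Q-statistic control limit and a recursive PCA update instead.

```python
# Hedged sketch of PCA residual-space / squared-prediction-error (SPE) detection.
# Feature construction from raw honeypot packets is assumed to have been done
# already; the matrices below are placeholders (rows = activities, cols = features).
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 12))   # baseline honeypot activity features
X_new = rng.normal(size=(50, 12))      # new observations to screen

# Centre the data and build the principal subspace from the top-k components.
mu = X_train.mean(axis=0)
Xc = X_train - mu
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3                                   # number of retained components (assumed)
P = Vt[:k].T                            # loading matrix, shape (features, k)

def spe(x):
    """Squared prediction error: squared norm of the residual after projection."""
    xc = x - mu
    residual = xc - P @ (P.T @ xc)
    return float(residual @ residual)

# Threshold from the training-set SPE distribution (99th percentile, an assumption).
threshold = np.percentile([spe(x) for x in X_train], 99)
alerts = [i for i, x in enumerate(X_new) if spe(x) > threshold]
print(f"flagged {len(alerts)} of {len(X_new)} new observations")
```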
Abstract:
Prostate cancer metastasis is reliant on the reciprocal interactions between cancer cells and the bone niche/micro-environment. The production of suitable matrices to study metastasis, carcinogenesis and, in particular, the prostate cancer/bone micro-environment interaction has been limited to specific protein matrices or matrix secreted by immortalised cell lines that may have undergone transformation processes altering signaling pathways and modifying gene or receptor expression. We hypothesize that matrices produced by primary human osteoblasts are a suitable means to develop an in vitro model system for bone metastasis research that mimics in vivo conditions. We have used a decellularized matrix secreted by primary human osteoblasts as a model for prostate cancer function in the bone micro-environment. We show that this collagen I-rich matrix is of fibrillar appearance, highly mineralized, and contains proteins, such as osteocalcin, osteonectin and osteopontin, and growth factors characteristic of bone extracellular matrix (ECM). LNCaP and PC3 cells grown on this matrix adhere strongly, proliferate, and express markers consistent with a loss of epithelial phenotype. Moreover, growth of these cells on the matrix is accompanied by the induction of genes associated with attachment, migration, increased invasive potential, Ca2+ signaling and osteolysis. In summary, we show that growth of prostate cancer cells on matrices produced by primary human osteoblasts mimics key features of prostate cancer bone metastases and is thus a suitable model system to study the tumor/bone micro-environment interaction in this disease.
Abstract:
Concerns regarding groundwater contamination with nitrate and the long-term sustainability of groundwater resources have prompted the development of a multi-layered, three-dimensional (3D) geological model to characterise the aquifer geometry of the Wairau Plain, Marlborough District, New Zealand. The 3D geological model, which consists of eight litho-stratigraphic units, has subsequently been used to synthesise hydrogeological and hydrogeochemical data for different aquifers in an approach that aims to demonstrate how integrating water chemistry data within the physical framework of a 3D geological model can help to better understand and conceptualise groundwater systems in complex geological settings. Multivariate statistical techniques (e.g. Principal Component Analysis and Hierarchical Cluster Analysis) were applied to the groundwater chemistry data to identify hydrochemical facies that are characteristic of distinct evolutionary pathways and a common hydrologic history of the groundwaters. Principal Component Analysis of the hydrochemical data demonstrated that natural water-rock interactions, redox potential and human agricultural impact are the key controls on groundwater quality in the Wairau Plain. Hierarchical Cluster Analysis revealed distinct hydrochemical water quality groups in the Wairau Plain groundwater system. Visualisation of the results of the multivariate statistical analyses and of the distribution of groundwater nitrate concentrations in the context of aquifer lithology highlighted the link between groundwater chemistry and the lithology of the host aquifers. The methodology followed in this study can be applied in a variety of hydrogeological settings to synthesise geological, hydrogeological and hydrochemical data and present them in a format readily understood by a wide range of stakeholders. This enables more efficient communication of the results of scientific studies to the wider community.
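For readers unfamiliar with the multivariate workflow named above, the sketch below shows a generic PCA-plus-hierarchical-clustering pass over a groundwater-chemistry table. It is illustrative only: the ion list, the synthetic values, the number of components and the four-cluster cut are assumptions, not details of the Wairau Plain study.

```python
# Hedged sketch: PCA and hierarchical cluster analysis on hydrochemical data.
import numpy as np
import pandas as pd
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Placeholder hydrochemical data (mg/L); a real analysis would load lab results.
rng = np.random.default_rng(1)
ions = ["Ca", "Mg", "Na", "K", "HCO3", "Cl", "SO4", "NO3"]
df = pd.DataFrame(rng.lognormal(mean=2.0, sigma=0.5, size=(120, len(ions))),
                  columns=ions)

# Standardise so each ion contributes comparably, then extract components.
Z = StandardScaler().fit_transform(df)
pca = PCA(n_components=3).fit(Z)
scores = pca.transform(Z)
print("explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
print("PC scores shape:", scores.shape)

# Hierarchical clustering (Ward linkage) to delineate hydrochemical facies;
# cutting the tree at 4 clusters is an arbitrary illustrative choice.
tree = linkage(Z, method="ward")
df["facies"] = fcluster(tree, t=4, criterion="maxclust")
print(df.groupby("facies")[ions].mean().round(1))
```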
Abstract:
Pressure feeder chutes are pieces of equipment used in sugar cane crushing to increase the amount of cane that can be put through a mill. The continuous pressure feeder was developed to provide a constant feed of bagasse under pressure to the mouth of the crushing mill, and the pressure feeder chute is used in a sugarcane milling unit to transfer bagasse from one set of crushing rolls to the next. There have been many pressure feeder chute failures in the past. The chute is quite vulnerable: if bagasse throughput is blocked at the mill rollers, the pressure build-up in the chute can be enormous and can ultimately cause failure, with substantial damage to the rollers, mill and chute construction and downtimes of up to 48 hours. Part of the problem is that bagasse behaviour in the pressure feeder chute is not well understood; if it were, the chute geometry could be modified to minimise the risk of failure, and there are possible avenues for changing chute design and operation with a view to producing more reliable pressure feeder chutes in the future. Previous experimental work has attempted to determine the causes of pressure feeder chute failures, and certain guidelines are available; however, failures continue and chute behaviour remains poorly understood. This thesis contains the work carried out between April 14th 2009 and October 10th 2012 on the design of an experimental apparatus to measure forces and visually observe bagasse behaviour, in an attempt to understand bagasse behaviour in pressure feeder chutes and minimise the risk of failure.
Abstract:
Aim: A new method of penumbral analysis is implemented which allows an unambiguous determination of field size, penumbra size and penumbra quality for small fields and other non-standard fields. Both source occlusion and lateral electronic disequilibrium affect the size and shape of cross-axis profile penumbrae; each is examined in detail. Method: In the proposed method, the square of the derivative of the cross-axis profile is plotted. The resultant graph displays two peaks in place of the two penumbrae. This gives a strong visualisation of the quality of a field penumbra, as well as a mathematically consistent definition of field size (the distance between the two peaks' maxima) and of penumbra (the full width at tenth maximum of each peak). Cross-axis profiles were simulated in a water phantom at a depth of 5 cm using Monte Carlo modelling, for field sizes between 5 and 30 mm. The field size and penumbra size of each field were calculated using the method above, as well as with the traditional definitions set out in IEC 976. The effects of source occlusion and lateral electronic disequilibrium on the penumbrae were isolated by repeating the simulations with electron transport removed and with an electron spot size of 0 mm, respectively. Results: All field sizes calculated using the traditional and proposed methods agreed within 0.2 mm. The penumbra size measured using the proposed method was systematically 1.8 mm larger than that from the traditional method at all field sizes. The size of the source had a larger effect on the size of the penumbra than did lateral electronic disequilibrium, particularly at very small field sizes. Conclusion: Traditional methods of calculating field size and penumbra proved mathematically adequate for small fields. However, the field size definition proposed in this study would be more robust for other non-standard fields, such as flattening-filter-free fields. Source occlusion plays a bigger role than lateral electronic disequilibrium in determining small-field penumbra size.
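A minimal sketch of the derivative-squared analysis described in the Method section above: field size is taken as the distance between the two maxima of the squared profile derivative, and the penumbra as the full width at tenth maximum (FWTM) of each peak. The profile used here is a synthetic blurred step rather than a Monte Carlo simulated profile, and no sub-sample interpolation is attempted.

```python
# Hedged sketch of the derivative-squared field-size/penumbra metric.
import numpy as np

def field_and_penumbra(x, dose):
    """Return (field size, left penumbra FWTM, right penumbra FWTM) in units of x."""
    g2 = np.gradient(dose, x) ** 2            # squared derivative of the profile
    mid = len(x) // 2
    i_left = int(np.argmax(g2[:mid]))         # maximum of the left penumbra peak
    i_right = mid + int(np.argmax(g2[mid:]))  # maximum of the right penumbra peak
    field_size = x[i_right] - x[i_left]

    def fwtm(i_peak):
        # Walk outwards from the peak until the curve drops below a tenth of its maximum.
        tenth = 0.1 * g2[i_peak]
        lo, hi = i_peak, i_peak
        while lo > 0 and g2[lo - 1] >= tenth:
            lo -= 1
        while hi < len(g2) - 1 and g2[hi + 1] >= tenth:
            hi += 1
        return x[hi] - x[lo]

    return field_size, fwtm(i_left), fwtm(i_right)

# Synthetic 10 mm field: a step function blurred by tanh-shaped penumbrae.
x = np.linspace(-20.0, 20.0, 2001)            # off-axis position in mm
dose = 0.5 * (np.tanh((x + 5.0) / 1.5) - np.tanh((x - 5.0) / 1.5))
print(field_and_penumbra(x, dose))
```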
Abstract:
Barmah Forest virus (BFV) disease is an emerging mosquito-borne disease in Australia. We aimed to outline some recent methods of using GIS for the analysis of BFV disease in Queensland, Australia. A large database of geocoded BFV cases has been established in conjunction with population data. The database has been used in recently published studies by the authors to determine spatio-temporal BFV disease hotspots and spatial patterns using spatial autocorrelation and semi-variogram analysis, in conjunction with the development of interpolated BFV disease standardised incidence maps. This paper briefly outlines the spatial analysis methodologies and GIS tools used in those studies. It summarises methods and results from the authors' previous studies and presents a GIS methodology to be used in future spatial analytical studies in an attempt to enhance the understanding of BFV disease in Queensland. The methodology developed is useful in improving the analysis of BFV disease data and will enhance the understanding of the distribution of BFV disease in Queensland, Australia.
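As one concrete example of the spatial-autocorrelation analysis referred to above, the sketch below computes a global Moran's I directly with NumPy. The coordinates, incidence values and inverse-distance weights are placeholders, not the Queensland BFV dataset, and the published studies may use different weighting schemes or local statistics.

```python
# Hedged sketch: global Moran's I for per-region incidence values (placeholder data).
import numpy as np

def morans_i(values, weights):
    """Global Moran's I for one value per region and a spatial weight matrix."""
    z = values - values.mean()
    n = len(values)
    s0 = weights.sum()
    return (n / s0) * (z @ weights @ z) / (z @ z)

rng = np.random.default_rng(2)
coords = rng.uniform(0, 100, size=(40, 2))             # placeholder region centroids (km)
incidence = rng.gamma(shape=2.0, scale=1.0, size=40)   # placeholder standardised incidence

# Inverse-distance weights with a zero diagonal (one of many possible weightings).
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
w = np.where(d > 0, 1.0 / d, 0.0)

print(f"Moran's I = {morans_i(incidence, w):.3f}")
```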
Abstract:
Existing planning theories tend to be limited in their analytical scope and often fail to account for the impact of the many interactions between the multitude of stakeholders involved in strategic planning processes. Although many theorists rejected structural-functional approaches from the 1970s, this article argues that many structural-functional concepts remain relevant and useful to planning practitioners. In fact, structural-functional approaches are highly useful and practical when used as a foundation for systemic analysis of real-world, multi-layered, complex planning systems to support evidence-based governance reform. Such approaches provide a logical and systematic basis for analysing the wider governance of strategic planning systems that is grounded in systems theory and complementary to existing theories of complexity and planning. While we do not propose structural functionalism as a grand theory of planning, this article discusses how structural-functional concepts and approaches might be applied to underpin a practical analysis of the complex decision-making arrangements that drive planning practice, and to provide the evidence needed to target reform of poorly performing arrangements.
Abstract:
The comments I make are based on my nearly twenty years' involvement in the dementia cause at both a national and an international level. In preparation, I read two papers, namely the Ministerial Dementia Forum Option Paper produced by KPMG Management Consultants (2014) and Analysis of Dementia Programmes and Services Funded by the Department of Social Services: Conversation Starter, prepared by KPMG as a preparation document for those attending a workshop in Brisbane on April 22nd 2015. Dementia is a complex syndrome and, as is often said, when you meet one person with dementia you have met one person with dementia, meaning that no two people with dementia are the same. Even in dementia care, Australia is a lucky country, and there is much to be said for the quality and diversity of dementia care available for people living with dementia. Despite this, I agree with the many views expressed in the material I read that there is scope for improvement, especially in the way that services are coordinated. In saying that, I do not purport to have all the solutions, nor do I claim to have the knowledge required to comment on all the programs covered by this review. If I appear to be a biased advocate for Alzheimer's Australia across the States and Territories, it is because I have seen constant evidence of ordinary people doing extraordinary things with inadequate resources. Dementia care is not cheap, and if those funding dementia services are interested primarily in economic outcomes and benefits, the real purpose of this consultation will be defeated. In addition, nowhere in the material I have read is there any recognition that in many instances program funding is a complex mix of government (at all levels) and private funding. This makes reviewing those programs more complex and less able to be coordinated at a Departmental level. It goes without saying, therefore, that the Federal Government is not the only player in this game. Of all those participating in this review, Alzheimer's Australia is best placed to comment on programs, as it is more connected to people living with dementia and probably has the best record of consulting with them. It would appear, however, that its role has been reduced to that of a bit player. Without wanting to be critical, of the 70 individuals and organisations whose comments the Forum Report deals with, only three (3), or 4.28%, were actual carers of people living with dementia. Even if it is argued that a number of the organisations present represented consumers, the percentage rises only marginally, to 8.57%, which is hardly an endorsement of the forum being consumer-driven. The majority of those present were service providers, each with their own agenda and each seeking advantage for their business. The final point I want to make before commenting on more specific, program-related issues is that many of the programs being reviewed have a much longer history than is reflected in the material I have read. Their growth and development were pioneered by Alzheimer's Australia organisations across the country, often with no government funding. Attempts to bring about better coordination of programs were often made at the behest of Alzheimer's Australia but in the main were ignored. The opportunity to now put this right is long overdue.