863 results for "Possible solutions"
Abstract:
The context of reflexive modernization leads to questioning of the role of traditional institutions, notably the Executive Branch. It can be assumed that the risk-society model is reflected in changes in Brazilian law and the Brazilian economy. Risks have become a constant and demand new forms of social action, which Ulrich Beck calls subpolitics. The change is felt especially in sectors central to the economy, such as oil and gas, while global risks are felt in the environmental crisis. Since institutions in the risk society are driven to review the way they act, the business community is given the task of finding mechanisms to overcome the environmental crisis. Corporate socio-environmental responsibility comes to be demanded as a counterpart to the profits obtained, especially from potentially polluting activities such as the oil industry. The precautionary principle, sustainable development, and the financial equation of the contract can be vectors for the adoption of socio-environmental responsibility by the oil industry; but for it to be seen as a new public reason, it must be shown that it can drive the evolution of society as a whole. Rio+20 defined the green economy as a new goal, especially for potentially polluting activities. The central objective of this work is to investigate the regulation of oil and gas companies, in particular the possibility of adopting socio-environmental responsibility, which is meant to impose conservation measures and pro-environment actions beyond those already required by law or by environmental licensing. The research aims to present possible solutions to the questions above, guaranteeing legal certainty for oil and gas companies while broadening the sector's sustainability, and proposing new rules that could be adopted in bidding documents and in concession and production-sharing contracts. At a time when directing oil royalties exclusively to education is under debate, the study of legal measures to implement socio-environmental responsibility in the oil sector becomes all the more necessary.
Abstract:
This is the report of the South and West Cumberland Fisheries Advisory Committee meeting held on 13 October 1975. The report contains information on the impact of work on the A66 Penrith to Workington road, brown trout fishing, the development of the Ehen as a fishery, reports of fisheries activities, Holmwrangle hatchery, land drainage representation, new fishery byelaws and fishing licence duties. The section on fisheries activities includes runs of fish, stocking, poaching and biological work. The section on Holmwrangle hatchery includes mortality numbers and possible solutions to avoid future pipe chokes and to improve the pumping system. The section on land drainage simply gives an up-to-date picture of developments. The Fisheries Advisory Committee was part of a Regional Water Authority, in this case the North West Water Authority, which preceded the Environment Agency, created in 1996.
Abstract:
According to DATASUS data, the philanthropic sector in Brazil comprises about 2,100 hospitals with more than 155,000 beds, representing 31% of the country's total. In other words, roughly one third of the country's hospital beds are philanthropic, making the sector an important service provider both to the Unified Health System (SUS) and to supplementary health care. These numbers demonstrate the importance of the philanthropic sector for the country and for many regions, such as the Centro-Sul Fluminense region of the State of Rio de Janeiro. The enormous complexity and diversity of the philanthropic hospital network poses many and varied challenges, both for hospital operation and for government policies to sustain the health sector. Besides the outdated SUS payment schedule, these hospitals face other problems such as financial ceilings, which lead to payments below what is actually produced; payment delays, which hamper planning and financial balance; closure of credit lines; and difficulties in negotiating with public managers, among many others. All these difficulties have critical repercussions on the management of these organizations, which face financial crises and the need for professional qualification and for upgrades to their facilities and equipment. This research set out to study the financial situation of the philanthropic hospitals of the Centro-Sul Fluminense region of the State of Rio de Janeiro, and to learn managers' perceptions of the sector's difficulties, their perspectives, and their proposals for possible solutions. The methodology was exploratory in nature, with a qualitative and quantitative approach and a multiple-case study, drawing on several data sources: primary sources, secondary sources and semi-structured interviews. The results demonstrated the importance of philanthropic hospitals in producing health services for the SUS and the need to formulate specific policies to sustain the philanthropic sector contracted to the SUS. In the Centro-Sul Fluminense region the public and philanthropic networks are complementary. Although the historical and strategic importance of these hospitals in providing services to the SUS is recognized, if current conditions of installed, organizational and financial capacity persist, the future of these institutions will undoubtedly be uncertain, with significant risks to the continuity of the services provided and to the hospitals' very survival. There is an urgent need for public managers to implement flexible management models aimed at establishing a partnership with the philanthropic hospitals, with actions defined by tangible, achievable goals.
Abstract:
EXTRACT (SEE PDF FOR FULL ABSTRACT): High-resolution proxy records of climate, such as varves, ice cores, and tree-rings, provide the opportunity for reconstructing climate on a year-by-year basis. In order to do so it is necessary to approximate the complex nonlinear response function of the natural recording system using linear statistical models. Three problems with this approach were discussed, and possible solutions were suggested. Examples were given from a reconstruction of Santa Barbara precipitation based on tree-ring records from Santa Barbara County.
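As a hedged illustration of the linear-calibration approach this abstract refers to (not the paper's actual model or data), the sketch below regresses a synthetic precipitation series on synthetic tree-ring chronologies over a calibration period and then applies the fitted linear transfer function to the full ring-width record; every name and number is an assumption made for the example.

```python
# Hedged sketch of linear calibration/reconstruction; all data are synthetic.
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "tree-ring chronologies" (3 sites, 300 years) and a precipitation
# series they respond to; the response is linear by construction here.
n_years, n_sites = 300, 3
rings = rng.normal(size=(n_years, n_sites))
true_beta = np.array([0.8, -0.3, 0.5])
precip = rings @ true_beta + 0.4 * rng.normal(size=n_years)

# Pretend the instrumental record covers only the last 80 years: fit the
# linear transfer function on that calibration period alone.
calib = slice(n_years - 80, n_years)
X_calib = np.column_stack([np.ones(80), rings[calib]])
beta, *_ = np.linalg.lstsq(X_calib, precip[calib], rcond=None)

# Apply the fitted model to the full ring-width record to "reconstruct"
# precipitation for the pre-instrumental years.
X_full = np.column_stack([np.ones(n_years), rings])
precip_rec = X_full @ beta

# Simple calibration-fit check (real studies also verify on withheld years).
r = np.corrcoef(precip[calib], precip_rec[calib])[0, 1]
print(f"calibration correlation: {r:.3f}")
```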
Abstract:
When considering the potential uptake and utilization of technology management tools by industry, it must be recognized that companies face the difficult challenges of selecting, adopting and integrating individual tools into a toolkit that must be implemented within their current organizational processes and systems. This situation is compounded by the lack of sound advice on integrating well-founded individual tools into a robust toolkit with the degree of flexibility needed to tailor it to the specific problems faced by individual organizations. As an initial stepping stone to offering a toolkit with empirically proven utility, this paper provides a conceptual foundation for the development of toolkits by outlining an underlying philosophical position based on observations from multiple research and commercial collaborations with industry. This stance is underpinned by a set of operationalized principles that can guide organizations in deciding upon the appropriate form, functions and features that any potential tool or toolkit should embody. For example, a key objective of any tool is to aid decision-making, and a core set of powerful, flexible, scalable and modular tools should be sufficient to allow users to generate, explore, shape and implement possible solutions across a wide array of strategic issues. From our philosophical stance, the preferred mode of engagement is facilitated workshops with a participatory process that enables multiple perspectives and structures the conversation through visual representations in order to manage the cognitive load in the collaborative environment. The generic form of the tools should be configurable for the given context and utilized in a lightweight manner based on the premise of 'start small and iterate fast'. © 2011 IEEE.
Abstract:
The use of III-nitride-based light-emitting diodes (LEDs) is now widespread in applications such as indicator lamps, display panels, backlighting for liquid-crystal display TVs and computer screens, traffic lights, etc. To meet the huge market demand and lower the manufacturing cost, the LED industry is moving fast from 2 inch to 4 inch and recently to 6 inch wafer sizes. Although Al2O3 (sapphire) and SiC remain the dominant substrate materials for the epitaxy of nitride LEDs, the use of large Si substrates attracts great interest because Si wafers are readily available in large diameters at low cost. In addition, such wafers are compatible with existing processing lines for 6 inch and larger wafers commonly used in the electronics industry. During the last decade, much exciting progress has been achieved in improving the performance of GaN-on-Si devices. In this contribution, the status and prospects of III-nitride optoelectronics grown on Si substrates are reviewed. The issues involved in the growth of GaN-based LED structures on Si and possible solutions are outlined, together with a brief introduction to some novel in situ and ex situ monitoring/characterization tools, which are especially useful for the growth of GaN-on-Si structures.
Abstract:
In the present review, the authors do not attempt a comprehensive survey of research on polymer/clay nanocomposites (PCNs); instead, they present selected examples that demonstrate the different exfoliation processes of clay in various polymer matrices and the resulting dispersed states of the clay. Interaction between polymers and layered silicates plays an important role in controlling the exfoliation process of the layered silicates and the microstructure of the nanocomposites, and the properties of polymer/layered silicate nanocomposites depend mainly on the dispersed state of the layered silicates. The authors also outline current research directions for PCNs, including a discussion of technical problems and their possible solutions.
Abstract:
The dissertation addresses the problems of signal reconstruction and data restoration in seismic data processing, taking signal-representation methods as the main thread and seismic information reconstruction (signal separation and trace interpolation) as the core. On signal representation in natural bases, I present the fundamentals and algorithms of independent component analysis (ICA) and its original applications to the separation of natural earthquake signals and of survey seismic signals. On signal representation in deterministic bases, the dissertation proposes least-squares inversion regularization methods for seismic data reconstruction, sparseness constraints, preconditioned conjugate gradient (PCG) methods, and their applications to seismic deconvolution, Radon transformation, etc. The core contribution is a de-aliasing algorithm for the reconstruction of unevenly sampled seismic data and its application to seismic interpolation. Although the dissertation discusses two cases of signal representation, they can be integrated into one framework, because both deal with signal or information restoration: the former reconstructs original signals from mixed signals, the latter reconstructs complete data from sparse or irregular data. Both aim to provide pre-processing and post-processing methods for seismic pre-stack depth migration. ICA can separate original signals from their mixtures, or extract basic structure from the analyzed data. I survey the fundamentals, algorithms and applications of ICA. Comparing it with the KL transformation, I propose the concept of an independent components transformation (ICT). On the basis of the negentropy measure of independence, I implement FastICA and improve it using the covariance matrix. After analyzing the characteristics of seismic signals, I introduce ICA into seismic signal processing, a first in the geophysical community, and implement the separation of noise from seismic signal. Synthetic and real-data examples show that ICA is usable for seismic signal processing, and promising initial results are achieved. ICA is applied to separating earthquake converted waves from multiples in a sedimentary area, with good results, yielding a more reasonable interpretation of subsurface discontinuities. The results show the promise of applying ICA to geophysical signal processing. Exploiting the relationship between ICA and blind deconvolution, I survey seismic blind deconvolution and discuss the prospects of applying ICA to it, with two possible solutions. The relationship between PCA, ICA and the wavelet transform is established, and it is proved that the reconstruction of wavelet prototype functions is a Lie group representation. In addition, an over-sampled wavelet transform is proposed to enhance seismic data resolution and is validated by numerical examples. The key to pre-stack depth migration is the regularization of pre-stack seismic data, for which seismic interpolation and missing-data reconstruction are necessary. I first review seismic imaging methods in order to argue the critical effect of regularization. Reviewing seismic interpolation algorithms, I note that de-aliased reconstruction of unevenly sampled data remains a challenge. The fundamentals of seismic reconstruction are discussed first; then sparseness-constrained least-squares inversion and a preconditioned conjugate gradient solver are studied and implemented.
Choosing a Cauchy-distributed constraint term, I program the PCG algorithm and implement sparse seismic deconvolution and high-resolution Radon transformation by PCG, in preparation for seismic data reconstruction. For seismic interpolation, de-aliased interpolation of evenly sampled data and reconstruction of unevenly sampled data each work well on their own, but they could not previously be combined. In this dissertation, a novel Fourier-transform-based method and algorithm are proposed that can reconstruct seismic data that are both unevenly sampled and aliased. I formulate band-limited data reconstruction as a minimum-norm least-squares inversion problem with an adaptive DFT-weighted norm regularization term. The inverse problem is solved by the preconditioned conjugate gradient method, which makes the solutions stable and quickly convergent. Based on the assumption that seismic data consist of a finite number of linear events, it follows from the sampling theorem that aliased events can be attenuated via least-squares weights predicted linearly from the low frequencies. Three application issues are discussed: interpolation of even gaps in traces, filling of uneven gaps, and reconstruction of high-frequency traces from low-frequency data constrained by a few high-frequency traces. Both synthetic and real-data numerical examples show that the proposed method is valid, efficient and applicable. The research is valuable for seismic data regularization and cross-well seismics. To meet the data requirements of 3D shot-profile depth migration, schemes must be adopted to make the data evenly sampled and consistent with the velocity dataset. The methods of this dissertation are used to interpolate and extrapolate the shot gathers instead of simply embedding zero traces, so the migration aperture is enlarged and the migration result improved, demonstrating the effectiveness and practicability of the approach.
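To make the inverse-problem formulation above concrete, here is a minimal, hedged sketch (not the dissertation's code) of band-limited trace reconstruction posed as a weighted-norm least-squares problem and solved with conjugate gradients on the normal equations. The fixed spectral weights stand in for the adaptive DFT-weighted norm regularizer described in the abstract; all names and data are illustrative assumptions.

```python
# Hedged sketch: recover a band-limited trace m from irregular samples d by
# minimizing ||S m - d||^2 + lam * ||diag(w)^(1/2) F m||^2, where S samples
# the known positions and F is the DFT.
import numpy as np

def reconstruct(d, mask, w, lam=0.1, n_iter=200):
    """mask: 0/1 float array of length N; d: samples at the 1-positions."""
    N = mask.size

    def A(m):  # normal-equations operator S^T S + lam * F^H diag(w) F (SPD)
        reg = np.fft.ifft(w * np.fft.fft(m)).real
        return mask * m + lam * reg

    b = np.zeros(N)
    b[mask.astype(bool)] = d          # S^T d: scatter samples onto full grid
    m = np.zeros(N)
    r = b - A(m)
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):           # standard conjugate gradient iteration
        Ap = A(p)
        alpha = rs / (p @ Ap)
        m += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < 1e-10:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return m

# Toy usage: a two-component band-limited trace observed at ~60% of positions.
rng = np.random.default_rng(0)
N = 256
t = np.arange(N)
m_true = np.sin(2 * np.pi * 5 * t / N) + 0.5 * np.sin(2 * np.pi * 12 * t / N)
mask = (rng.random(N) < 0.6).astype(float)
d = m_true[mask.astype(bool)]
freqs = np.fft.fftfreq(N)
w = (np.abs(freqs) / np.abs(freqs).max()) ** 2   # penalize high frequencies
m_rec = reconstruct(d, mask, w)
print("rms error:", np.sqrt(np.mean((m_rec - m_true) ** 2)))
```

The weighting choice is the design knob: penalizing high frequencies regularizes the gaps, while the dissertation's adaptive weights, predicted from the low-frequency spectrum, additionally suppress aliased events.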
Abstract:
The Circum-Bohai region (112°~124°E, 34°~42°N) is rich in oil and gas while also exhibiting vigorous intraplate seismic activity. Although the tectonic structure of this region is very complicated, a great deal of geological, geophysical and geochemical research has been carried out. In this paper, guided by the ideas of "One, Two, Three and Many" and "the deep controls the shallow, the regional constrains the local", I take full advantage of previous results to establish a general image of this region. After collecting the arrival times of P-wave phases from local and teleseismic events recorded by stations within the region from 1966 to 2004, I process the data, build an initial model, and obtain a tomographic image of the crust and upper mantle. With reference to previous results, we compare the images at various depths and along five cross-profiles traversing the region in different directions, and finally draw conclusions. The principal contents are as follows: 1) the first chapter states the purpose and significance of this thesis, advances in seismic tomography, and the research contents and plan; 2) the second chapter introduces the regional geological setting of the Circum-Bohai region, describing the tectonic and evolutionary characteristics of the principal tectonic units, including the Bohai Bay Basin, the Yanshan Fold Zone, the Taihangshan Uplifted Zone, the Jiao-Niao Uplifted Zone and the Luxi Uplifted Zone, as well as the major deep faults; 3) the third chapter discusses previous geophysical research, i.e., gravity and geomagnetic characteristics, geothermal flow, seismic activity, physical properties of rocks, deep seismic sounding, and earlier seismic tomography; 4) the fourth chapter introduces the fundamental theory and methods of seismic tomography; 5) the fifth chapter covers the techniques and procedures used in this thesis, including data collection and pre-processing, establishment of the initial velocity model, and relocation of all events; 6) the sixth chapter discusses and analyzes the tomographic images at various depths and along the five cross-sections; 7) the seventh chapter summarizes the results and states the remaining problems and possible solutions.
Abstract:
Malicious software (malware) has significantly increased in number and effectiveness over the past years. Until 2006, such software was mostly used to disrupt network infrastructures or to show off coders' skills. Nowadays, malware constitutes a very important source of economic profit and is very difficult to detect: thousands of novel variants are released every day, and modern obfuscation techniques ensure that signature-based anti-malware systems cannot detect such threats. This tendency has also appeared on mobile devices, with Android being the most targeted platform. To counteract this phenomenon, the scientific community has developed many approaches that attempt to increase the resilience of anti-malware systems. Most of these approaches rely on machine learning and have become very popular in commercial applications as well. However, attackers are now knowledgeable about these systems and have started preparing countermeasures, leading to an arms race between attackers and developers: novel systems are progressively built to tackle attacks that become more and more sophisticated. Developers therefore need to anticipate the attackers' moves, which means that defense systems should be built proactively, i.e., by introducing security design principles into their development. The main goal of this work is to show that such a proactive approach can be employed in a number of case studies. To do so, I adopted a global methodology that can be divided into two steps: first, understanding the vulnerabilities of current state-of-the-art systems (this anticipates the attackers' moves); then, developing novel systems that are robust to these attacks, or suggesting research guidelines along which current systems can be improved. This work presents two main case studies, concerning the detection of PDF and Android malware; the idea is to show that a proactive approach can be applied both in the x86 and in the mobile world. The contributions to these two case studies are manifold. With respect to PDF files, I first develop novel attacks that can empirically and optimally evade current state-of-the-art detectors, and then propose possible solutions to increase the robustness of such detectors against known and novel attacks. With respect to the Android case study, I first show how current signature-based tools and academically developed systems are weak against empirical obfuscation attacks, which can easily be employed without particular knowledge of the targeted systems, and then examine a possible strategy for building a machine learning detector that is robust against both empirical obfuscation and optimal attacks. Finally, I show how proactive approaches can also be employed to develop systems that are not aimed at detecting malware, such as mobile fingerprinting systems; in particular, I propose a methodology for building a powerful mobile fingerprinting system and examine possible attacks with which users might be able to evade it, thus preserving their privacy.
To provide the aforementioned contributions, I co-developed (in cooperation with researchers at PRALab and Ruhr-Universität Bochum) various systems: a library to perform optimal attacks against machine learning systems (AdversariaLib), a framework for automatically obfuscating Android applications, a system for the robust detection of JavaScript malware inside PDF files (LuxOR), a robust machine learning system for the detection of Android malware, and a system to fingerprint mobile devices. I also contributed to the development of Android PRAGuard, a dataset containing many empirical obfuscation attacks against the Android platform. Finally, I entirely developed Slayer NEO, an evolution of a previous system for the detection of PDF malware. The results attained with these tools show that it is possible to proactively build systems that predict possible evasion attacks, suggesting that a proactive approach is crucial to building systems that provide concrete security against general and evasion attacks.
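As a concrete illustration of the "optimal evasion" idea discussed above, the following is a minimal, hedged sketch (not AdversariaLib's actual API, which is not reproduced here) of a gradient-based evasion attack against a differentiable detector; the logistic-regression scorer, the feature vector and the L2 perturbation budget are all assumptions made for the example.

```python
# Hedged sketch of gradient-based evasion: descend the score of a
# differentiable detector from a malicious sample until it is classified
# benign, while keeping the perturbation within an L2 budget.
import numpy as np

def evade(w, b, x, step=0.1, max_iter=200, budget=1.0):
    """Evade a logistic-regression detector score(x) = sigmoid(w.x + b);
    label is 'malicious' when score > 0.5."""
    x_adv = x.astype(float).copy()
    for _ in range(max_iter):
        score = 1.0 / (1.0 + np.exp(-(w @ x_adv + b)))
        if score <= 0.5:                      # detector now says 'benign'
            break
        grad = score * (1.0 - score) * w      # gradient of score w.r.t. x
        x_adv -= step * grad / (np.linalg.norm(grad) + 1e-12)
        delta = x_adv - x                     # keep perturbation feasible:
        n = np.linalg.norm(delta)             # project onto the L2 ball
        if n > budget:
            x_adv = x + delta * (budget / n)
    return x_adv

# Toy usage with a hypothetical 5-feature detector.
rng = np.random.default_rng(1)
w = rng.normal(size=5)
b = -0.2
x_mal = np.abs(rng.normal(size=5))
x_evasive = evade(w, b, x_mal, budget=2.0)
```

In a real PDF or Android setting the features are discrete and constrained (often one can only add objects or code, not remove them), so practical attacks project onto those feasibility constraints rather than a plain L2 ball.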
Abstract:
The recent history and current trends in the collection and archiving of forest information and models are reviewed. The question is posed as to whether the community of forest modellers ought to take action in setting up a Forest Model Archive (FMA) as a means of conserving and sharing the heritage of forest models developed over several decades. The paper discusses the various alternatives of what an FMA could be, and should be, and then formulates a conceptual model as the basis for the construction of an FMA. Finally, the question of software architecture is considered; again, there are a number of possible solutions. We discuss the alternatives, some in considerable detail, but leave the final decisions on these issues to the forest modelling community. This paper has spawned the "Greenwich Initiative" on the FMA. An internet discussion group on the topic will be started and launched by the "Trafalar Group", which will span both IUFRO 4.1 and 4.11, and further discussion is planned to take place at the Forest Modelling Conference in Portugal, June 2002.
Abstract:
Mycobacterium avium ssp. paratuberculosis (MAP) causes Johne's disease in cattle and other ruminants and has been implicated as a possible cause of Crohn's disease in humans. The organism gains access to raw milk directly, through excretion into the milk within the udder, and indirectly, through faecal contamination during milking. MAP has been shown to survive commercial pasteurization in naturally infected milk, even at the extended holding time of 25 s. Pasteurized milk must therefore be considered a vehicle of transmission of MAP to humans. Isolation methods for MAP from milk are problematic, chiefly because of the absence of a suitable selective medium, which makes food surveillance programs and research on this topic difficult. The MAP problem can be addressed in two main ways: by devising a milk-processing strategy that ensures the death of the organism, and/or by farm-level strategies to prevent access of the organism to raw milk. Much of the research to date has been devoted to determining if a problem exists and, if so, its extent; little has been directed at possible solutions. Given the current state of information on this topic and the potential consequences for the dairy industry, research is urgently needed so that the risks and the efficacy of possible processing solutions can be better understood.