930 results for Large detector systems for particle and astroparticle physics


Relevance:

100.00%

Publisher:

Abstract:

The Large Hadron Collider (LHC) is the main particle accelerator at CERN, built to search for elementary particles and to help science investigate our universe. Radiation in the LHC arises from the circular acceleration of charged particles, so detectors tracking particles under these severe conditions must be radiation tolerant. Moreover, the planned luminosity upgrade (up to 10^35 cm^-2 s^-1) requires further development of particle detector structures. This work presents a new type of 3D stripixel detector with significant structural improvements. The new radiation-hard detector has a three-dimensional (3D) array of p+ and n+ electrodes that penetrate into the detector bulk; electrons and holes are collected at the oppositely biased electrodes. The proposed 3D stripixel detector demonstrates a lower full depletion voltage than planar detectors. Low depletion voltage is one of the main advantages, because only the depleted part of the device is active area. The small spacing between electrodes also shortens charge collection distances, which results in a fast detector response. Dual-column detectors, which contain both n+ and p+ columnar electrodes in their structure, are also briefly discussed; they show a better electric field distribution than single-sided radiation detectors, and the dead space (the low electric field region) is significantly suppressed. Simulations were carried out with the Atlas device simulation software, and the electric field distributions under different bias voltages are presented as results.
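The lower full depletion voltage claimed for the 3D geometry follows from the quadratic dependence of the depletion voltage on the electrode spacing, V_fd = e·N_eff·d²/(2·ε_Si). A minimal sketch; the effective doping concentration and the spacings below are illustrative assumptions, not values taken from the thesis:

```python
# Hedged sketch: full-depletion voltage scales with the square of the
# electrode spacing, V_fd = e * N_eff * d^2 / (2 * eps_Si).
# N_eff and the spacings are illustrative assumptions.
e = 1.602e-19               # elementary charge [C]
eps_si = 11.9 * 8.854e-12   # permittivity of silicon [F/m]
n_eff = 1e18                # effective doping concentration [m^-3] (assumed)

def full_depletion_voltage(d_m):
    """Full depletion voltage [V] for electrode spacing d_m [m]."""
    return e * n_eff * d_m**2 / (2 * eps_si)

v_planar = full_depletion_voltage(300e-6)  # planar: 300 um wafer thickness
v_3d = full_depletion_voltage(50e-6)       # 3D: ~50 um column spacing (assumed)
print(f"planar: {v_planar:.1f} V, 3D: {v_3d:.1f} V")
```

Shrinking the drift distance from the full wafer thickness to the inter-column spacing reduces the required depletion voltage by (300/50)² = 36 under these assumptions, which is why a few volts can suffice to deplete a 3D device.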

Abstract:

The design methods and languages targeted at modern System-on-Chip designs face tremendous pressure from ever-increasing complexity, power, and speed requirements. In estimating any of these three metrics, there is a trade-off between accuracy and the level of detail at which the system under design is analyzed: the more detailed the description, the more accurate the simulation will be, but also the more time consuming. Moreover, a designer wants to make decisions as early as possible in the design flow to avoid costly design backtracking. To answer the challenges posed by System-on-Chip designs, this thesis introduces a formal, power-aware framework, its development methods, and methods to constrain and analyze the power consumption of the system under design. The thesis discusses power analysis of synchronous and asynchronous systems, including the communication aspects of these systems. The presented framework is built upon the Timed Action System formalism, which offers an environment to analyze and constrain the functional and temporal behavior of the system at a high abstraction level. Furthermore, given the complexity of System-on-Chip designs, the ability to abstract away unnecessary implementation details at higher abstraction levels is an essential part of the introduced design framework. The encapsulation and abstraction techniques, combined with procedure-based communication, allow a designer to use the presented power-aware framework to model these large-scale systems. The introduced techniques also enable the development of communication and computation to be subdivided into separate tasks, a property that is taken into account in the power analysis as well. Finally, the framework is developed so that it can be used throughout a design project: a designer is able to model and analyze systems from an abstract specification down to an implementable specification.

Abstract:

The aim of this study is to analyse the content of the interdisciplinary conversations in Göttingen between 1949 and 1961. The task is to compare the models for describing reality presented by quantum physicists and theologians. Descriptions of reality in different disciplines are conditioned by the development of the concept of reality in philosophy, physics and theology. Our basic problem is stated in the question: How is it possible for the intramental image to match the external object? Cartesian knowledge presupposes clear and distinct ideas in the mind prior to observation, resulting in a true correspondence between the observed object and the cogitative observing subject. The Kantian synthesis between rationalism and empiricism emphasises the extended character of representation: the human mind is not a passive receiver of external information, but actively construes intramental representations of external reality in the epistemological process. Heidegger's aim was to reach a more primordial mode of understanding reality than is possible within the Cartesian subject-object distinction. In Heidegger's philosophy, ontology as being-in-the-world is prior to knowledge concerning being. Ontology can be grasped only in the totality of being (Dasein), not merely as an object of reflection and perception. According to Bohr, quantum mechanics introduces an irreducible loss in representation, which classically understood is a deficiency in knowledge. The conflicting aspects (particle and wave pictures) in our comprehension of physical reality cannot be completely accommodated into a single coherent model of reality. What Bohr rejects is not realism, but the classical Einsteinian version of it. By the use of complementary descriptions, Bohr tries to save a fundamentally realistic position. The fundamental question in Barthian theology is the problem of God as an object of theological discourse.
Dialectics is Barth's way to express knowledge of God while avoiding a speculative theology and a human-centred religious self-consciousness. In Barthian theology, the human capacity for knowledge, independently of revelation, is insufficient to comprehend the being of God. Our knowledge of God is real knowledge in revelation, and our words are made to correspond with the divine reality in an analogy of faith. The point of the Bultmannian demythologising programme was to claim the real existence of God beyond our faculties. We cannot simply define God as a human ideal of existence or a focus of values. The theological programme of Bultmann emphasised the notion that we can talk meaningfully of God only insofar as we have existential experience of his intervention. Common to all these twentieth-century philosophical, physical and theological positions is a form of anti-Cartesianism; consequently, in regard to their epistemology, they can be labelled antirealist. This common insight also made it possible to find a common meeting point between the different disciplines. In this study, the different standpoints from all three areas and the conversations in Göttingen are analysed in the framework of realism/antirealism. One of the first tasks in the Göttingen conversations was to analyse the nature of the likeness between the complementary structures in quantum physics introduced by Niels Bohr and the dialectical forms in the Barthian doctrine of God. The reaction against epistemological Cartesianism, the metaphysics of substance and the deterministic description of reality was the common point of departure for theologians and physicists in the Göttingen discussions. In his complementarity, Bohr anticipated the crossing of traditional epistemic boundaries and the generalisation of epistemological strategies by introducing interpretative procedures across various disciplines.

Abstract:

IT Service Management plays a key role in many IT organizations today. The first IT Service Management principles were founded in the early 1980s, but real adoption emerged only in the late 2000s. IT Financial Management is one of the IT Service Management processes. The main purpose of this thesis was to study how the IT Financial Management approach can be improved in a case company. Budgeting, accounting and charging are the IT Financial Management functions researched in this thesis. The thesis material consists of both qualitative and quantitative data: the theoretical part draws mostly on IT Service Management literature, while the empirical part examines interviews and the case company's information systems. The thesis also reviews different kinds of systems that support and automate the IT Financial Management functions. The biggest challenge in the case company is cost allocation with the current ERP system. It is worthwhile to adopt a group-based allocation system until a holistic system becomes available on the market. The case company should also continue developing its IT service processes.

Abstract:

Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages the methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have gained the majority of attention in the field. In this thesis we focus on another type of learning problem, that of learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of the setting, we can recover the bipartite ranking problem, corresponding to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction, and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven to be challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, and how these techniques can be implemented efficiently. The contributions of this thesis are as follows. First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning, and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, which is one of the best-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions to cross-validation when using the approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternative approaches. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts: Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
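The leave-pair-out idea can be illustrated in a few lines: for every positive-negative pair, the model is retrained on the remaining data and the held-out pair is checked for correct ordering; the fraction of correctly ordered pairs estimates the AUC. A minimal sketch with a toy 1-D dataset and a plain least-squares scorer, both illustrative assumptions rather than the thesis' RankRLS implementation:

```python
# Hedged sketch of leave-pair-out cross-validation for AUC estimation.
# The toy 1-D data and the simple least-squares scorer are illustrative
# assumptions, not the thesis code.

def fit_least_squares(xs, ys):
    """Ordinary least squares on 1-D inputs; returns a scoring function."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs) or 1.0
    w = cov / var
    b = my - w * mx
    return lambda x: w * x + b

def leave_pair_out_auc(xs, ys):
    """For each (positive, negative) index pair, retrain on the rest and
    check whether the positive example is scored above the negative one."""
    pos = [i for i, y in enumerate(ys) if y == 1]
    neg = [i for i, y in enumerate(ys) if y == 0]
    correct = 0
    for i in pos:
        for j in neg:
            rest = [k for k in range(len(xs)) if k not in (i, j)]
            f = fit_least_squares([xs[k] for k in rest], [ys[k] for k in rest])
            correct += 1 if f(xs[i]) > f(xs[j]) else 0
    return correct / (len(pos) * len(neg))

xs = [0.1, 0.4, 0.35, 0.8, 0.9, 0.2]
ys = [0, 0, 1, 1, 1, 0]
print(leave_pair_out_auc(xs, ys))
```

Because each held-out pair is scored by a model that never saw it, the estimate avoids the optimistic bias of resubstitution, which is the property the thesis' simulation study examines.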

Abstract:

The environmental aspect of corporate social responsibility (CSR), expressed through the process of EMS implementation in oil and gas companies, is the main subject of this research. In the theoretical part, attention is paid primarily to justifying the link between CSR and environmental management. The achievement of sustainable competitive advantage as a result of environmental capital growth, and the inclusion of socially responsible activities in corporate strategy, is another issue of special significance here. In addition, two basic forms of environmental management systems (environmental decision support systems and environmental information management systems) are explored, and their role in effective stakeholder interaction is examined. The most crucial benefits of an EMS are also analyzed to underline its importance as a source of sustainable development. The research is based on a survey of 51 sampled oil and gas companies (both publicly owned and state-owned), originating from different countries all over the world and providing openly accessible sustainability reports. To analyze their approach to sustainable development, a specifically designed evaluation matrix with 37 indicators was prepared in accordance with the Global Reporting Initiative (GRI) guidelines for non-financial reporting. Additionally, the quality of environmental information disclosure was measured on the basis of a quality-quantity matrix. According to the results, oil and gas companies prefer implementing reactive measures over the costly and knowledge-intensive proactive techniques for eliminating negative environmental impacts. It was also identified that environmental performance disclosure is mostly rather limited, so the quality of non-financial reporting can be judged as insufficient. Although most of the oil and gas companies in the sample claim that an EMS is currently embedded in their structure, they often do not provide any details of its implementation. As potential for the further development of EMS, the author mentions the possible integration of their different forms into a single entity, extension of the existing structure through consolidation of structural and strategic precautions, and development of a unified certification standard, instead of the several that exist today, in order to enhance control of EMS implementation.

Abstract:

The current research focuses on various questions raised and deliberated upon by different entrepreneurs. It provides a valuable contribution to understanding the importance of social media and ICT applications, and demonstrates how to support management consulting and business coaching start-ups with the help of social media and ICT tools. The thesis presents a literature review drawing on information systems science, SME and e-business journals, web articles, and survey analysis reports on social media applications. The methodology is qualitative: social anthropological approaches were used to oversee the case study activities and collect data, and a collaborative social research approach framed the action research method. The research discovered that new business start-ups and small businesses do not use social media and ICT tools the way most large corporations do. Yet current open-source ICT technologies and social media applications are just as available to new and small businesses as they are to larger companies. Successful implementation of social media and ICT applications can readily enhance start-up performance and help overcome business obstacles. The thesis sheds some light on the effective and innovative implementation of social media and ICT applications for new and small business entrepreneurs.

Abstract:

Toxoplasmosis is a zoonotic disease caused by the protozoan Toxoplasma gondii. The aim of the present study was to determine the occurrence of T. gondii and to identify the risk factors associated with its transmission to chickens raised in different systems (free-range and confined) to produce eggs or meat. The 810 animals were allocated into two experimental groups according to the purpose of the production system: 460 broiler chickens (Group 1) and 350 layer chickens (Group 2). In order to analyze the possible factors involved in T. gondii infection in the chickens, an epidemiological questionnaire was completed for all properties. Serological detection of anti-Toxoplasma gondii antibodies was performed by Indirect Immunofluorescence (IFAT) and by Enzyme-Linked Immunosorbent Assay (ELISA). Since the agreement index (kappa) between these two serological techniques was considered high, 21.2% of the 810 animals were considered reactive. In Group 1, 12.2% (56/460) were positive, while in Group 2 the positivity rate was 33.1% (116/350). The production system may influence the seropositivity of the animals in both groups; however, only in Group 2 was there a statistically significant relationship between the breeding system and the frequency of positive sera. This result indicates that, at least for laying hens, the production system is directly involved in T. gondii infection. Contact with cats did not influence the distribution of seroreactive animals in Group 1, but a significant relationship was observed in Group 2. The occurrence of anti-T. gondii antibodies was high in both groups (broiler and layer chickens). Free-range chickens raised for egg production proved to be the group most exposed to T. gondii infection. This can be related to the fact that these animals stay for longer periods on the farms, in direct contact with soil possibly contaminated by the presence of domestic cats.
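The agreement index used to compare the two serological techniques is Cohen's kappa, which corrects the observed agreement between two binary tests for the agreement expected by chance from their marginal rates. A minimal sketch; the 2×2 counts below are made up for illustration and are not the study's data:

```python
# Hedged sketch of Cohen's kappa for two binary diagnostic tests.
# The 2x2 agreement counts are illustrative assumptions, not the
# study's IFAT/ELISA data.

def cohens_kappa(both_pos, test_a_only, test_b_only, both_neg):
    """Cohen's kappa from the 2x2 agreement counts of two binary tests."""
    n = both_pos + test_a_only + test_b_only + both_neg
    observed = (both_pos + both_neg) / n
    # Chance agreement from the marginal positive/negative rates.
    p_a = (both_pos + test_a_only) / n
    p_b = (both_pos + test_b_only) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

# Hypothetical counts: agree-positive, A-only, B-only, agree-negative.
print(round(cohens_kappa(160, 6, 12, 632), 3))
```

Values near 1 indicate agreement far beyond chance, which is the situation the study describes when it treats the two techniques as interchangeable.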

Abstract:

Currently, a high penetration level of Distributed Generation (DG) is observed in Danish distribution systems, and even more DG units are foreseen in the upcoming years. How to utilize them to maintain the security of the power supply in emergency situations is of great interest for study. This master's project develops a control architecture for studying distribution systems with large-scale integration of solar power. As part of the EcoGrid EU Smart Grid project, it focuses on the modelling and simulation of a representative Danish LV network located on the island of Bornholm. Regarding the control architecture, two types of reactive power control techniques are implemented and compared. In addition, a network voltage control based on a tap-changer transformer is tested. After applying a genetic algorithm to five typical Danish domestic load profiles, the optimized results show lower power losses and voltage deviation using Q(U) control, especially at high consumption. Finally, a communication and information exchange system is developed with the objective of regulating the reactive power, and thereby the network voltage, remotely and in real time. Validation tests of the simulated parameters are performed as well.
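A Q(U) control of the kind compared in the project maps the locally measured voltage to a reactive power setpoint through a piecewise-linear droop with a deadband: reactive power is absorbed when the voltage rises above the deadband and injected when it falls below. A minimal sketch; the per-unit thresholds and the available reactive power q_max are assumptions, not the project's actual settings:

```python
# Hedged sketch of a Q(U) droop characteristic for a PV inverter.
# Thresholds and q_max are illustrative per-unit assumptions.

def q_of_u(u_pu, q_max=0.44, u_low=0.96, u_dead_lo=0.99,
           u_dead_hi=1.01, u_high=1.04):
    """Reactive power setpoint [p.u.] as a piecewise-linear droop on the
    local voltage u_pu [p.u.]; positive = injection (voltage support)."""
    if u_pu <= u_low:
        return q_max
    if u_pu < u_dead_lo:          # ramp down from q_max to 0
        return q_max * (u_dead_lo - u_pu) / (u_dead_lo - u_low)
    if u_pu <= u_dead_hi:         # deadband: no reactive exchange
        return 0.0
    if u_pu < u_high:             # ramp from 0 to -q_max (absorption)
        return -q_max * (u_pu - u_dead_hi) / (u_high - u_dead_hi)
    return -q_max

for u in (0.95, 1.00, 1.03):
    print(u, round(q_of_u(u), 3))
```

Because each inverter reacts only to its own terminal voltage, the scheme is decentralized; the communication system described above is then needed only to retune the droop parameters remotely, not to close the control loop.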

Abstract:

This thesis studies quality, productivity and economy in welding manufacturing in West African states such as Ghana, Nigeria and Cameroon. The study consists of two parts: the first part, which forms the theoretical background, reviews relevant literature concerning the metal and welding industries and the measurement of welding quality, productivity and economy. The second, empirical part aims to identify activities in the metal manufacturing industries where welding is extensively used and to determine the extent of welding quality, productivity and economy measurement in companies operating in these industries. Additionally, the thesis aims to identify challenges that companies face and to assess the feasibility of creating a network to address these issues. The research methods used in the empirical part are the case study (qualitative) method and the survey (quantitative) method: the case study method was used to elicit information from companies in Ghana, while the survey method was used for companies in Nigeria and Cameroon. The study considers important areas that contribute to creating awareness and understanding of the current situation of the welding industry in West Africa. These areas include the metal manufacturing industrial sector, the metal products manufactured, the production and manufacturing systems deployed, the welding quality, productivity and economy measurement systems utilized, the equipment and materials on the markets, the general challenges facing companies in welding operations, welding technology programs and research in local universities, and a SWOT analysis of the various West African states. The notable findings indicate that the majority of the companies operate in the construction industrial sector. The majority of the companies are also project-manufacturing oriented, providing services to customers in growing industries such as oil and gas, mining, food and energy. In addition, only a few companies are certified under standards such as ISO 9001, ISO 3834, and OHSAS 18001. Furthermore, the majority of the companies employ manual welding techniques, with shielded metal arc welding (SMAW) as the most commonly used welding process. Finally, welder salary was about € 300 / month as of June 2013, and the average operations turnover of medium to large companies was about € 5 million / year as of 2012. Based on analysis of the results, it is noted that while welding activities are growing, the availability of cheap labor, the need for company and welder qualification and certification, and the need to manufacture innovative products through development projects (transfer of welding expertise and technology) remain untapped opportunities in the welding industry of the West African states. The study serves as a solid platform for further research and concludes with several recommendations for the development of the West African welding industry.

Abstract:

Cotton is highly susceptible to interference from the weed community, making control measures essential to ensure crop yield. Herbicides are the primary method of weed control in large-scale production areas, and usually more than one herbicide application is necessary due to the long crop cycle. This study aimed to evaluate the selectivity of different chemical weed control systems for conventional cotton. The experiment was carried out in the field in a randomized block design, with twenty-nine treatments and four replications in a split-plot layout (adjacent double check). Results showed that triple mixtures in pre-emergence increased the chance of observing reductions in cotton yield. To avoid yield reductions, users should apply at most a mixture of two herbicides in pre-emergence, followed by S-metolachlor over the top, followed by one post-emergence mixture application of pyrithiobac-sodium + trifloxysulfuron-sodium.

Abstract:

This study is based on a large survey of over 1,500 Finnish companies' usage and needs of management accounting systems, and of the difficulties they face in implementing them. The study uses quantitative, qualitative and mixed methods to answer the research questions. The empirical data was gathered through structured interviews with randomly selected companies of varying sizes and industries. The study answers the three research questions by analyzing the characteristics and behaviors of companies operating in Finland. The study found five distinctive groups of companies according to the characteristics of their cost information and management accounting system use, and showed that the state of cost information and management accounting systems depends on the industry and size of the companies. It was found that over 50% of the companies either did not know how their systems could be updated or saw their systems as inadequate. The qualitative side also highlighted the need for tailored and integrated management accounting systems to create more value for managers. The major inhibitors of new system implementation were the lack of both monetary and human resources. Through the use of mixed methods and design science, a new and improved sophistication model is created, building on previous research results and the literature. The sophistication model shows the different stages of management accounting systems in use and what companies can achieve by implementing and upgrading their systems.

Abstract:

The thesis focuses on the water chemistry of experimental test facilities and their reference VVER reactors. The main objective is to provide recommendations on water chemistry management for the laboratory facilities (VEERA, PACTEL) simulating VVERs and for the large future facilities of Lappeenranta University of Technology. First, the concept of nuclear power generation and the applicability of nuclear power are discussed. Next, the different water chemistry and water purification systems currently used in the primary and secondary circuits at power plants are outlined. The construction geometry and design of the test facilities PACTEL and VEERA, as well as the operating principles of their main equipment, are also described. Finally, appropriate water chemistry and water treatment systems are proposed for the existing and future experimental facilities of LUT.

Abstract:

The main objective of this thesis is to evaluate the economic and environmental effectiveness of three different renewable energy systems: solar PV, wind energy and biomass energy systems. Financial methods such as the Internal Rate of Return (IRR) and the Modified Internal Rate of Return (MIRR) were used to evaluate economic competitiveness. Seasonal variability in the power generation capability of the different renewable systems was also taken into consideration. To evaluate the environmental effectiveness of the different energy systems, default values in the GaBi software were used, with the functional unit defined as 1 kWh. The results show that solar PV systems are difficult to justify on both economic and environmental grounds. Wind energy performs better on both grounds and is capable of competing with conventional energy systems. Biomass energy systems exhibit mid-level environmental and economic performance. In each of these systems, results vary.
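The IRR and MIRR calculations used to compare the systems can be sketched as follows; the cash flows (an initial investment followed by equal yearly net revenues) and the rates are illustrative assumptions, not the thesis data:

```python
# Hedged sketch of IRR and MIRR for a project cash-flow series.
# The cash flows and rates are illustrative assumptions.

def npv(rate, cash_flows):
    """Net present value of cash_flows[t] occurring at the end of year t."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection on NPV(rate) = 0, assuming a
    conventional sign pattern (outflow first, then inflows)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def mirr(cash_flows, finance_rate, reinvest_rate):
    """Modified IRR: negative flows discounted at finance_rate, positive
    flows compounded forward to the horizon at reinvest_rate."""
    n = len(cash_flows) - 1
    pv_neg = sum(cf / (1 + finance_rate) ** t
                 for t, cf in enumerate(cash_flows) if cf < 0)
    fv_pos = sum(cf * (1 + reinvest_rate) ** (n - t)
                 for t, cf in enumerate(cash_flows) if cf > 0)
    return (fv_pos / -pv_neg) ** (1 / n) - 1

flows = [-1000, 300, 300, 300, 300, 300]  # hypothetical 5-year project
print(round(irr(flows), 4), round(mirr(flows, 0.05, 0.05), 4))
```

MIRR avoids IRR's implicit assumption that intermediate revenues are reinvested at the IRR itself, which is why it typically gives a lower, more conservative figure for profitable projects.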

Abstract:

A new method for sampling the exact (within the nodal error) ground state distribution and nondifferential properties of multielectron systems is developed and applied to first-row atoms. Calculated properties are the distribution moments and the electronic density at the nucleus (the δ operator). For this purpose, new simple trial functions are developed and optimized. First, using hydrogen as a test case, we demonstrate the accuracy of our algorithm and its sensitivity to error in the trial function. Applications to first-row atoms are then described. We obtain results which are more satisfactory than the ones obtained previously using Monte Carlo methods, despite the relative crudeness of our trial functions. Also, a comparison is made with results of highly accurate post-Hartree-Fock calculations, thereby illuminating the nodal error in our estimates. Taking into account the CPU time spent, our results, particularly for the δ operator, have a relatively large variance. Several ways of improving the efficiency, together with some extensions of the algorithm, are suggested.
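Hydrogen makes a good accuracy check because for the trial function ψ = exp(−αr) with α = 1 the local energy is exactly −0.5 hartree at every point, so any deviation isolates the algorithmic error. A minimal variational Monte Carlo sketch in that spirit; the step size, starting point and sample count are illustrative assumptions, and this is plain Metropolis sampling of |ψ|², not the thesis' exact-distribution algorithm:

```python
# Hedged sketch: variational Monte Carlo for hydrogen with the trial
# function psi = exp(-alpha * r). Parameters are illustrative assumptions.
import math
import random

def local_energy(r, alpha):
    """E_L = -(1/2) (laplacian psi)/psi - 1/r for psi = exp(-alpha r),
    which simplifies to -alpha^2/2 + (alpha - 1)/r."""
    return -0.5 * alpha * alpha + (alpha - 1.0) / r

def vmc_energy(alpha, n_steps=20000, step=0.5, seed=1):
    """Metropolis sampling of |psi|^2 = exp(-2 alpha r); returns the
    average local energy (an upper bound on the true -0.5 hartree)."""
    rng = random.Random(seed)
    pos = [0.5, 0.5, 0.5]
    total = 0.0
    for _ in range(n_steps):
        trial = [x + rng.uniform(-step, step) for x in pos]
        r_old = math.dist(pos, [0.0, 0.0, 0.0])
        r_new = math.dist(trial, [0.0, 0.0, 0.0])
        # Accept with probability min(1, |psi(new)|^2 / |psi(old)|^2).
        if rng.random() < math.exp(-2 * alpha * (r_new - r_old)):
            pos = trial
        total += local_energy(math.dist(pos, [0.0, 0.0, 0.0]), alpha)
    return total / n_steps

print(vmc_energy(1.0))
```

With α = 1 the estimator has zero variance, illustrating the general point that the variance of a Monte Carlo energy estimate reflects the quality of the trial function.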