973 results for integrated approaches
Abstract:
Global interest in renewable energy production, such as wind and solar power, is increasing, which in turn calls for new energy storage concepts due to the larger share of intermittent energy production. Power-to-gas solutions can be used to convert surplus electricity into chemical energy, which can be stored for extended periods of time. The energy storage concept explored in this thesis is an integrated energy storage tank connected to an oxy-fuel combustion plant. Using this approach, flue gases from the plant could be fed directly into the storage tank and later converted into synthetic natural gas via the electrolysis-methanation route. This work uses computational fluid dynamics to model the desublimation of carbon dioxide inside a storage tank containing a cryogenic liquid, such as liquefied natural gas. Numerical modelling enables the evaluation of the transient flow patterns caused by the desublimation, as well as the general fluid behaviour inside the tank. Based on the simulations, the stability of the cryogenic storage and the magnitudes of the key parameters can be evaluated.
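As a back-of-the-envelope illustration of why CO2 desublimates in such a tank, the following minimal sketch compares the flue-gas CO2 partial pressure with the equilibrium sublimation pressure at LNG temperature, using a Clausius-Clapeyron fit anchored at the CO2 triple point; the gas composition, sublimation enthalpy, and temperature are illustrative assumptions, not values from the thesis.

```python
# Minimal sketch: CO2 desublimation driving force at LNG temperature.
# All numbers are illustrative assumptions, not values from the thesis.
import numpy as np

def co2_sublimation_pressure(T_K):
    """Approximate CO2 sublimation pressure [Pa], Clausius-Clapeyron form."""
    R, dH = 8.314, 26.0e3            # gas constant; ~sublimation enthalpy [J/mol]
    T_tp, p_tp = 216.6, 5.18e5       # CO2 triple point [K, Pa]
    return p_tp * np.exp(-dH / R * (1.0 / T_K - 1.0 / T_tp))

p_co2 = 0.15 * 1.013e5                  # assumed 15 vol-% CO2 in flue gas [Pa]
p_eq = co2_sublimation_pressure(111.0)  # ~LNG storage temperature [K]
print(f"p_CO2 = {p_co2:.3g} Pa, p_eq = {p_eq:.3g} Pa")
# p_CO2 >> p_eq, so CO2 desublimates inside the cryogenic storage.
```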
Abstract:
The main goal of this work is to clarify two thermochemical conversion processes of biomass, pyrolysis and torrefaction, and to identify how and where exactly these processes can be integrated. Integration into a CHP power plant process was chosen as one of the most promising options, as this integration concept enables the development of multiple products. The possible pros and cons were analyzed on the basis of experimental data collected in previous studies related to the topic of this work. In addition, one real integrated case is presented in the last part of the work. Finally, a brief summary highlights the main ideas.
Abstract:
The brain is a complex system, which produces emergent properties such as those associated with activity-dependent plasticity in processes of learning and memory. Therefore, understanding the integrated structures and functions of the brain is well beyond the scope of either superficial or extremely reductionistic approaches. Although a combination of zoom-in and zoom-out strategies is desirable when the brain is studied, constructing the appropriate interfaces to connect all levels of analysis is one of the most difficult challenges of contemporary neuroscience. Is it possible to build appropriate models of brain function and dysfunction with computational tools? Among the best-known brain dysfunctions, epilepsies are neurological syndromes that involve a variety of networks, from widespread anatomical brain circuits to local molecular environments. One logical question would be: are those complex brain networks always producing maladaptive emergent properties compatible with epileptogenic substrates? The present review will deal with this question and will try to answer it by illustrating several points from the literature and from our laboratory data, with examples at the behavioral, electrophysiological, cellular and molecular levels. We conclude that, because the brain is a complex system compatible with the production of emergent properties, including plasticity, its functions should be approached using an integrated view. Concepts such as brain networks, graph theory, neuroinformatics, and e-neuroscience are discussed as new transdisciplinary approaches for dealing with the continuous growth of information about brain physiology and its dysfunctions. The epilepsies are discussed as neurobiological models of complex systems displaying maladaptive plasticity.
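To make the graph-theoretic viewpoint mentioned in the review concrete, here is a minimal sketch computing two classic small-world metrics on a toy network; the Watts-Strogatz model and all parameters are illustrative stand-ins, not data from the review.

```python
# Toy illustration of graph-theoretic network measures (not review data).
import networkx as nx

# A small-world graph is a common abstraction for cortical connectivity.
g = nx.connected_watts_strogatz_graph(n=100, k=6, p=0.1, seed=1)

print("average clustering coefficient:", round(nx.average_clustering(g), 3))
print("characteristic path length:",
      round(nx.average_shortest_path_length(g), 3))
```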
Abstract:
Single-photon emission computed tomography (SPECT) is a non-invasive imaging technique that provides information on the functional states of tissues. SPECT imaging has been used as a diagnostic tool in several human disorders and can be used in animal models of diseases for physiopathological, genomic and drug discovery studies. However, most of the experimental models used in research involve rodents, which are at least one order of magnitude smaller in linear dimensions than man. Consequently, images of targets obtained with conventional gamma-cameras and collimators have poor spatial resolution and statistical quality. We review the methodological approaches developed in recent years to obtain images of small targets with good spatial resolution and sensitivity. Multipinhole, coded-mask and slit-based collimators are presented as alternative approaches to improve image quality. In combination with appropriate decoding algorithms, these collimators permit a significant reduction of the time needed to register the projections used to make 3-D representations of the volumetric distribution of the target's radiotracers. Simultaneously, they can be used to minimize the artifacts and blurring that arise when single-pinhole collimators are used. Representative images are presented to illustrate the use of these collimators. We also comment on the use of coded masks to attain tomographic resolution with a single projection, as discussed by some investigators since their introduction for obtaining near-field images. We conclude this review by showing that the use of appropriate hardware and software tools adapted to conventional gamma-cameras can be of great help in obtaining relevant functional information in experiments using small animals.
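As a sketch of the decoding step referred to above, the following toy example reconstructs a point source from a coded-aperture shadowgram by correlation decoding; the random binary mask is a hypothetical stand-in for the MURA and slit patterns used in practice.

```python
# Toy correlation decoding of a coded-aperture projection (illustrative).
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)
mask = (rng.random((31, 31)) < 0.5).astype(float)  # random binary aperture
decoder = 2.0 * mask - 1.0                         # balanced decoding pattern

obj = np.zeros((64, 64))                           # point-like radiotracer spot
obj[20, 40] = 1.0

detector = fftconvolve(obj, mask, mode="same")     # recorded shadowgram
estimate = fftconvolve(detector, decoder[::-1, ::-1], mode="same")  # decode
print(np.unravel_index(estimate.argmax(), estimate.shape))  # ~ (20, 40)
```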
Abstract:
Human epidermal growth factor receptor 2 (HER2) has been evaluated in breast cancer patients to identify those most likely to benefit from Herceptin-targeted therapy. HER2 amplification, detected in 20-30% of invasive breast tumors, is associated with reduced survival and metastasis. The most frequently used technique for evaluating HER2 protein status as a routine procedure is immunohistochemistry (IHC). HER2 copy number alterations have also been evaluated by fluorescence in situ hybridization (FISH) in cases with moderate immunoexpression (IHC 2+). An alternative procedure to evaluate gene amplification is chromogenic in situ hybridization (CISH), which has some advantages over FISH, including the correlation between HER2 status and morphological features. Other methodologies have also been used, such as silver-enhanced in situ hybridization (SISH) and quantitative real-time RT-PCR, to determine HER2 gene copy number and expression, respectively. Here we present a short, comprehensive review of current advances concerning HER2 evaluation in human breast cancer.
Abstract:
This study aimed to assess the efficacy of a rural community-based integrated intervention for the early prevention and management of chronic obstructive pulmonary disease (COPD) in China. This 18-year cluster-randomized controlled trial encompassing 15 villages included 1008 patients (454 men and 40 women in the intervention group [mean age, 54 ± 10 years]; 482 men and 32 women in the control group [mean age, 53 ± 10 years]) with confirmed COPD or at risk for COPD. Villages were randomly assigned to the intervention or the control group, and study participants residing within the villages received treatment accordingly. Intervention group patients took part in a program that included systematic health education, smoking cessation counseling, and education on the management of COPD. Control group patients received usual care. The groups were compared after 18 years regarding the incidence of COPD, decline in lung function, and COPD-related mortality. COPD incidence was lower in the intervention group than in the control group (10% vs 16%, P < 0.05). The decline in lung function was also significantly delayed in the intervention group compared to the control group, among both COPD patients and high-risk individuals. The intervention group showed significant improvement in smoking cessation compared with the control group, and smokers in the intervention group had lower smoking indices than those in the control group (350 vs 450, P < 0.05). The intervention group also had a significantly lower cumulative COPD-related death rate than the control group (37% vs 47%, P < 0.05). A rural community-based integrated intervention is effective in reducing the incidence of COPD among those at risk, delaying the decline in lung function in COPD patients and those at risk, and reducing COPD-related mortality.
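For readers who want to check the headline incidence comparison, a minimal sketch follows; the event counts are back-calculated from the reported group sizes (494 and 514) and the ~10% vs ~16% rates, so they are approximations rather than the study's raw data.

```python
# Hedged re-check of the incidence comparison with a chi-square test.
# Counts are approximations derived from the reported percentages.
from scipy.stats import chi2_contingency

intervention = [49, 494 - 49]   # ~10% of 494 participants developed COPD
control = [82, 514 - 82]        # ~16% of 514 participants developed COPD
chi2, p, dof, expected = chi2_contingency([intervention, control])
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # p < 0.05, as the abstract reports
```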
Abstract:
Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, today, customers are asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity, because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly; usually, many other aspects of the software, such as performance, security, scalability, and usability, also need to be verified. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models to address some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or missing tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer stand-alone tools, tools integrated with industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
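As a minimal sketch of what generating tests from behavioral models can mean in practice, the following toy example derives event sequences achieving transition coverage of a small state machine; the model and coverage criterion are illustrative and not the thesis's UML-based toolchain.

```python
# Toy model-based test generation: transition coverage of a state machine.
from collections import deque

# (source state, event) -> target state; an illustrative vending machine.
transitions = {
    ("Idle", "insertCoin"): "Ready",
    ("Ready", "pressButton"): "Dispensing",
    ("Ready", "cancel"): "Idle",
    ("Dispensing", "done"): "Idle",
}

def transition_coverage_tests(start="Idle"):
    """One test (event sequence) per transition, via shortest reaching paths."""
    paths, queue = {start: []}, deque([start])
    while queue:                      # BFS: shortest event path to each state
        state = queue.popleft()
        for (src, event), dst in transitions.items():
            if src == state and dst not in paths:
                paths[dst] = paths[state] + [event]
                queue.append(dst)
    return [paths[src] + [event] for (src, event) in transitions]

for test in transition_coverage_tests():
    print(" -> ".join(test))
```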
Abstract:
Ontology matching is an important task when data from multiple data sources are integrated. Problems of ontology matching have been studied widely in the research literature, and many different solutions and approaches have been proposed, including in commercial software tools. In this survey, well-known approaches to ontology matching, and to its subtype schema matching, are reviewed and compared. The aim of this report is to summarize the knowledge about state-of-the-art solutions from the research literature, discuss how the methods work in different application domains, and analyze the pros and cons of different open-source and academic tools in the commercial world.
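To give a flavor of the element-level techniques such surveys compare, here is a minimal baseline matcher using token-overlap similarity between attribute names; real matchers add structural, instance-based and semantic evidence, and the schemas below are invented examples.

```python
# Baseline element-level schema matcher: token Jaccard similarity.
import re

def tokens(name):
    """Split camelCase / snake_case identifiers into lowercase tokens."""
    spaced = re.sub(r"([a-z])([A-Z])", r"\1 \2", name).replace("_", " ")
    return {t.lower() for t in spaced.split()}

def jaccard(a, b):
    ta, tb = tokens(a), tokens(b)
    return len(ta & tb) / len(ta | tb)

schema_a = ["customerName", "birth_date", "emailAddress"]   # invented
schema_b = ["name_of_customer", "dateOfBirth", "email"]     # invented

for a in schema_a:
    best = max(schema_b, key=lambda b: jaccard(a, b))
    print(f"{a:15s} -> {best:18s} (sim = {jaccard(a, best):.2f})")
```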
Abstract:
Imagine the potential of an organization whose business and IT processes are well aligned and capable of responding both reactively and proactively to external and internal changes. The Philips IT Infrastructure and Operations department (I&O) is undergoing a series of transformation activities to help the Philips businesses keep up with such changes. I&O would serve a critical function in any business sector; given that I&O's strategy switched from "design, build and run" to "specify, acquire and performance manage", that function is amplified. In 2013, I&O's biggest transformation programme, I&O Futures, engaged multiple interdisciplinary departments and programs in decommissioning legacy processes and restructuring new processes with respect to the Information Technology Infrastructure Library (ITIL), helping I&O achieve a common infrastructure and operating platform (CI&OP). The author joined I&O Futures in early 2014 and contributed to CI&OP release 1, during which the Bing Box model was designed and evaluated through the lens of Six Sigma's structured define-measure-analyze-improve-control (DMAIC) improvement approach. The Bing Box model was intended, first, to combine business and IT principles, namely Lean IT, Agile, ITIL best practices, and aspect-oriented programming (AOP), into a framework. Second, the author implemented modularized optimization cycles according to the defined framework in Philips' ITIL-based processes, in order to enhance business process performance and to increase the efficiency of the optimization cycles. What is unique about this thesis is that the Bing Box model not only provides comprehensive optimization approaches and principles for business process performance, but also integrates and standardizes optimization modules for the optimization process itself. The research followed design research guidelines, which seek to extend the boundaries of human and organizational capabilities by creating new and innovative artifacts. Chapter 2 reviews the current research on Lean Six Sigma, Agile, AOP and ITIL, aiming to identify the broad conceptual bases for this study. Chapter 3 describes the process of constructing the Bing Box model. Chapter 4 describes the adoption of the Bing Box model in two implementation cases, validated by stakeholders through observations and interviews. Chapter 5 contains the concluding remarks, the limitations of this research work and future research areas. Chapter 6 provides the references used in this thesis.
Abstract:
More than half of municipalities' annual budgets is spent on social and health services. In the future, the population will age, and the changing dependency ratio will tighten municipal resources and increase the need for services. Consequently, creating effective and cost-efficient solution models is of paramount importance. The aim of this study is to build a model that can be used for the continuous monitoring of productivity, effectiveness and cost-effectiveness in regional social and health services, and to test the model with example cases. The main emphasis is on effectiveness and cost-effectiveness. According to earlier research, there are several approaches to measuring productivity, effectiveness and cost-effectiveness. In this study, productivity is assessed by the ratio of inputs to outputs, effectiveness by the use of services, and cost-effectiveness by the costs of service use. The literature reveals a clear need for continuous monitoring of effectiveness across traditional organizational boundaries. Earlier research consists of one-off studies in which the indicators are often at the operational level, domain-specific or difficult to measure. This study is design science. The end result of the study is a model of social and health service use (the sote service-use model), for which a database and a reporting layer are implemented. In this study, the sote service-use model is tested with three different organizational units and customer groups that are strategically significant for the organization under study and that are the targets of a clear development measure (investment in rehabilitation, and the dismantling of institutional care for the disabled and the elderly). The sote service-use model produces information on the productivity, effectiveness and cost-effectiveness of the development measures. The sote service-use model is found to be suitable for the organization under study, where it is in continuous use. The model is also transferable to other social and health care organizations and can be extended to other social and health services and closely related services.
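A minimal sketch of the three indicators as operationalized above, on invented figures (none of the numbers come from the study):

```python
# Toy computation of productivity, effectiveness and cost-effectiveness
# as defined in the abstract. All figures are invented for illustration.
inputs_eur = 2_000_000.0                 # resources spent on a customer group
outputs = 5_000                          # service episodes produced
use_before, use_after = 9_000, 7_500     # service use, e.g. bed-days per year
cost_before, cost_after = 3.1e6, 2.6e6   # costs of that service use [EUR]

productivity = outputs / inputs_eur                  # outputs per euro of input
effectiveness = use_before - use_after               # avoided service use
cost_effectiveness = (cost_before - cost_after) / effectiveness

print(f"productivity:       {productivity:.4f} outputs/EUR")
print(f"effectiveness:      {effectiveness} avoided units of service use")
print(f"cost-effectiveness: {cost_effectiveness:.0f} EUR saved per avoided unit")
```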
Abstract:
Building-integrated photovoltaics (BIPV) are considered the future of photovoltaic (PV) technology. The advantage of a BIPV system is its multi-functionality: it fulfils the functions of a building envelope with the added benefit of generating power, replacing traditional roofing and façade materials with power-generating PV. In this thesis, different types of PV cells and modules are described in detail together with their efficiencies and usage trends over the last decade. The different BIPV products for roofs and façades are discussed in detail with several examples. The electricity generation potential of BIPV in selected countries is compared with their actual electricity consumption. Further, the avoided greenhouse gas (GHG) emissions associated with electricity generation from traditional sources and with transmission and distribution (T&D) losses are calculated; the results illustrate large savings in GHGs. Different types of façades and backsheets are used in BIPV. In this thesis, selected backsheets and façades were characterized in terms of their surface structure using infrared spectroscopy (FTIR-ATR) and scanning electron microscopy with energy-dispersive X-ray analysis (SEM-EDX), and physically characterized using surface energy measurements. FTIR-ATR identified the surface polymeric materials, and SEM-EDX identified the surface elements. Surface energy measurements were useful for selecting adhesives and determining the surface energies of the various backsheets and façades. The strength of adhesion between the façades and backsheets was studied using a peel test. Four different types of adhesives were used to study the fracture patterns and peel-test values and to identify the most suitable adhesive. It was found that pretreatment increased the adhesion strength significantly.
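The avoided-emissions arithmetic described above can be sketched as follows; the grid emission factor, T&D loss rate and BIPV output are assumptions for illustration, not the thesis's values.

```python
# Sketch of avoided-GHG arithmetic for on-site BIPV generation.
# All input values are illustrative assumptions.
bipv_generation_mwh = 1_000.0   # annual BIPV output
grid_ef_tco2_per_mwh = 0.5      # assumed grid emission factor
td_loss_rate = 0.08             # assumed T&D loss share

# On-site generation also avoids the losses incurred delivering grid power,
# so it displaces generation / (1 - loss) of central generation.
displaced_mwh = bipv_generation_mwh / (1.0 - td_loss_rate)
avoided_tco2 = displaced_mwh * grid_ef_tco2_per_mwh
print(f"avoided emissions: {avoided_tco2:.0f} tCO2 per year")
```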
Abstract:
There are more than 7000 languages in the world, and many of these have emerged through linguistic divergence. While questions related to the drivers of linguistic diversity have been studied before, including with quantitative methods, there is no consensus as to which factors drive linguistic divergence, and how. In this thesis, I have studied linguistic divergence with a multidisciplinary approach, applying the framework and quantitative methods of evolutionary biology to language data. With quantitative methods, large datasets may be analyzed objectively, while approaches from evolutionary biology make it possible to revisit old questions (related, for example, to the shape of the phylogeny) with new methods, and to adopt novel perspectives to pose novel questions. My chief focus was on the effects exerted on the speakers of a language by environmental and cultural factors. My approach was thus an ecological one, in the sense that I was interested in how the local environment affects humans and whether this human-environment connection plays a possible role in the divergence process. I studied this question in relation to the Uralic language family and to the dialects of Finnish, thus covering two different levels of divergence. However, as the Uralic languages had not previously been studied using quantitative phylogenetic methods, nor had population genetic methods previously been applied to any dialect data, I first evaluated the applicability of these biological methods to language data. I found the biological methodology to be applicable to language data, as my results were rather similar to traditional views as to both the shape of the Uralic phylogeny and the division of Finnish dialects. I also found environmental conditions, or changes in them, to be plausible inducers of linguistic divergence, whether in the first steps of the divergence process, i.e. dialect divergence, or on a large scale with the entire language family. My findings concerning Finnish dialects led me to conclude that the functional connection between linguistic divergence and environmental conditions may arise through human cultural adaptation to varying environmental conditions; this is also one possible explanation on the scale of the Uralic language family as a whole. The results of the thesis bring insights into several different issues in both a local and a global context. First, they shed light on the emergence of the Finnish dialects. If the approach used in the thesis is applied to the dialects of other languages, broader generalizations may be drawn as to the inducers of linguistic divergence, which again brings us closer to understanding the global patterns of linguistic diversity. Secondly, the quantitative phylogeny of the Uralic languages, with estimated times of language divergences, yields another hypothesis as to the shape and age of the language family tree. In addition, the Uralic languages can now be added to the growing list of language families studied with quantitative methods, which will allow broader inferences as to global patterns of language evolution, and more language families can be included in constructing the tree of the world's languages. Studying history through language, however, is only one way to illuminate the human past. Therefore, thirdly, the findings of the thesis, when combined with studies of other language families and with studies in, for example, genetics and archaeology, bring us closer to an understanding of human history.
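As an illustration of the kind of distance-based phylogenetic analysis applied to language data, the following sketch clusters a few languages from binary trait vectors; the traits and values are invented, not the thesis's datasets.

```python
# Toy distance-based clustering of languages from binary trait vectors.
# Traits and values are invented for illustration.
import numpy as np
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import pdist

languages = ["Finnish", "Estonian", "Hungarian", "NorthSaami"]
traits = np.array([            # 1 = feature present (illustrative)
    [1, 1, 0, 1, 1, 0, 1, 0],  # Finnish
    [1, 1, 0, 1, 0, 0, 1, 0],  # Estonian
    [0, 1, 1, 0, 0, 1, 0, 1],  # Hungarian
    [1, 0, 0, 1, 1, 0, 0, 0],  # NorthSaami
])

dist = pdist(traits, metric="hamming")   # pairwise trait disagreement
tree = linkage(dist, method="average")   # UPGMA-style hierarchical clustering
print(tree)                              # merge order mirrors a family tree
```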