994 results for component architecture


Abstract:

Plasticized poly(vinyl chloride) (pPVC), although a major material in the medical field, currently faces criticism because of limitations such as the leaching of the toxic plasticizer di(2-ethylhexyl) phthalate (DEHP) into the surrounding medium and the emission of the environmental pollutant dioxin when PVC products are incinerated after use. For these reasons, efforts are under way to reduce the use of pPVC in the medical field considerably and to find viable alternative materials. The present study was undertaken in this context to find a suitable material to replace pPVC in the manufacture of medical aids. Its main focus has been to identify a non-DEHP plasticizer for pPVC and another material suitable for the complete replacement of pPVC in blood/blood component storage applications. Two approaches have been pursued: (1) the controversial plasticizer DEHP has been partially replaced by polymeric plasticizers; (2) an alternative material, metallocene polyolefin (mPO), has been used and suitably modified to match the properties of the flexible PVC used for blood and blood component storage applications.

Abstract:

The laser-produced plasma from the multi-component target YBa2Cu3O7 was analyzed using Michelson interferometry and time-resolved emission spectroscopy. The interaction of 10 ns pulses of 1.06 μm radiation from a Q-switched Nd:YAG laser at laser power densities ranging from 0.55 GW cm-2 to 1.5 GW cm-2 has been studied. Time-resolved spectral measurements of the plasma evolution show distinct features at different points in its temporal history. For times less than 55 ns after the laser pulse (for a typical laser power density of 0.8 GW cm-2), the emission spectrum is dominated by black-body radiation. During cooling, after 55 ns, the spectral emission consists mainly of lines from neutral and ionic species. Line-averaged electron densities were deduced from the interferometric measurements at various laser power densities. Plasma electron densities are of the order of 10^17 cm-3, and the plasma temperature in the core region is about 1 eV. Measurements of the emission line intensities of various ions inside the plasma gave evidence of multiphoton ionization of the elements constituting the target at low laser power densities. At higher laser power densities the ionization mechanism is collision dominated. For elements such as nitrogen, present outside the target, ionization is due to collisions only.
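As a hedged aside (standard plasma interferometry, not a formula quoted from the abstract): for electron densities far below the critical density of the probe wavelength, the interferometric phase shift is proportional to the line-integrated electron density, which is how line-averaged densities are commonly deduced:

    \mu = \sqrt{1 - n_e/n_c}, \qquad n_c = \frac{\varepsilon_0 m_e \omega^2}{e^2}

    \Delta\phi = \frac{2\pi}{\lambda} \int (1 - \mu)\, dl \;\approx\; \frac{\pi}{\lambda\, n_c} \int n_e\, dl \qquad (n_e \ll n_c)

At n_e ~ 10^17 cm-3 the plasma is far underdense for any optical probe (n_c ~ 10^21 cm-3 for visible light), so the linearized form applies, and dividing the inferred line integral by the plasma length gives the line-averaged density.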

Abstract:

Department of Mathematics, Cochin University of Science and Technology

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. Such analysis can significantly improve software quality and is still a challenging field. This dissertation contributes an architecture-oriented code validation, error localization and optimization technique that assists the embedded system designer in software debugging, making the early detection of software bugs that are otherwise hard to detect more effective, using static analysis of machine code. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code. Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application program.
An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code. An algorithm is proposed that assists the compiler in eliminating redundant bank-switching code and in deciding on an optimum data allocation to banked memory, resulting in a minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram, formed for the active memory bank state transitions corresponding to each bank selection instruction, are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified. This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code patterns, which drastically reduces state-space creation and contributes to improved state-of-the-art model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool should be very useful in steering novices towards the correct use of difficult microcontroller features when developing embedded systems.
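As a hedged sketch of the bank-switching optimization idea (the mnemonics and the pseudo-op below are illustrative assumptions, not the dissertation's actual rule set), a pass over a linear machine-code listing can track the active-bank state and flag bank-select instructions whose state transition is a self-loop:

    # Minimal sketch: detect redundant bank-select instructions in a
    # straight-line PIC-like instruction listing. The mnemonics and the
    # bank encoding are illustrative assumptions.

    def find_redundant_bank_switches(instructions):
        """Return indices of bank-select instructions that leave the
        active-bank state unchanged (and are therefore redundant)."""
        redundant = []
        bank = None  # unknown active bank at entry
        for i, (mnemonic, operand) in enumerate(instructions):
            if mnemonic == "BANKSEL":          # assumed bank-select pseudo-op
                if operand == bank:            # state transition is a self-loop
                    redundant.append(i)
                bank = operand
            elif mnemonic in ("CALL", "RETURN"):
                bank = None                    # callee may change the bank
        return redundant

    program = [
        ("BANKSEL", 1), ("MOVWF", "TRISB"),
        ("BANKSEL", 1),                        # redundant: bank 1 already active
        ("MOVWF", "OPTION_REG"),
        ("CALL", "init"), ("BANKSEL", 1),      # not redundant: bank unknown after CALL
    ]
    print(find_redundant_bank_switches(program))  # -> [2]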

Abstract:

Multi-component reactions are effective in building complex molecules in a single step, in a minimum amount of time and with facile isolation procedures; they are highly economical [1-7] and have thus become a powerful synthetic strategy in recent years [8-10]. Multi-component protocols are even more attractive when carried out in aqueous medium. Water offers several benefits, including control over exothermicity, and product isolation can be carried out by a single phase-separation step. Pyranopyrazoles are a biologically important class of heterocyclic compounds; in particular, dihydropyrano[2,3-c]pyrazoles play an essential role in promoting biological activity and represent an interesting template in medicinal chemistry. Heterocyclic compounds bearing the 4H-pyran unit have received much attention in recent years as they constitute important precursors for promising drugs [11-13]. Pyrano[2,3-c]pyrazoles exhibit analgesic [14], anti-cancer [15], anti-microbial and anti-inflammatory [16] activity. Furthermore, dihydropyrano[2,3-c]pyrazoles show molluscicidal activity [17,18] and are used in a screening kit for Chk1 kinase inhibitor activity [19,20]. They also find applications as pharmaceutical ingredients and bio-degradable agrochemicals [21-29]. Junek and Aigner [30] first reported the synthesis of pyrano[2,3-c]pyrazole derivatives from 3-methyl-1-phenylpyrazolin-5-one and tetracyanoethylene in the presence of triethylamine. Subsequently, a number of synthetic approaches, such as the use of triethylamine [31], piperazine [32], piperidine [33], N-methylmorpholine in ethanol [34], microwave irradiation [35,36], solvent-free conditions [37-39], cyclodextrins (CDs) [40], different bases in water [41], γ-alumina [42] and L-proline [43], have been reported for the synthesis of 6-amino-4-alkyl/aryl-3-methyl-2,4-dihydropyrano[2,3-c]pyrazole-5-carbonitriles. Recently, tetraethylammonium bromide (TEABr) has emerged as a mild, water-tolerant, eco-friendly and inexpensive catalyst. To the best of our knowledge, quaternary ammonium salts, and more specifically TEABr, have not been used as catalysts for the synthesis of pyrano[2,3-c]pyrazoles, and we therefore decided to investigate the application of TEABr as a catalyst for the synthesis of a series of pyrazole-fused pyran derivatives via multi-component reactions.

Abstract:

In this paper, we have evolved a generic software architecture for a domain-specific distributed embedded system. The system under consideration belongs to the command, control and communication systems domain. Systems in this domain have very long operational lifetimes, and their quality attributes are as important as their functional requirements. The main guiding principle followed in this paper for evolving the software architecture has been functional independence of the modules. The quality attributes considered most important for the system are maintainability and modifiability. Architectural styles best suited to the functionally independent modules are proposed with focus on these quality attributes. The software architecture for the system is envisioned as a collection of the architectural styles of the functionally independent modules identified.

Abstract:

A spectral-angle-based feature extraction method, Spectral Clustering Independent Component Analysis (SC-ICA), is proposed in this work to improve brain tissue classification from Magnetic Resonance Images (MRI). SC-ICA gives equal priority to global and local features, thereby attempting to resolve the inefficiency of conventional approaches in abnormal tissue extraction. First, the input multispectral MRI is divided into different clusters by spectral-distance-based clustering. Then, Independent Component Analysis (ICA) is applied to the clustered data, in conjunction with Support Vector Machines (SVM), for brain tissue analysis. Normal and abnormal datasets, consisting of real and synthetic T1-weighted, T2-weighted and proton density/fluid-attenuated inversion recovery images, were used to evaluate the performance of the new method. Comparative analysis with ICA-based SVM and other conventional classifiers established the stability and efficiency of SC-ICA-based classification, especially in the reproduction of small abnormalities. Analysis of clinical abnormal cases demonstrated this through the highest Tanimoto index/accuracy values, 0.75/98.8%, observed against the ICA-based SVM results of 0.17/96.1% for reproduced lesions. The experimental results recommend the proposed method as a promising approach for clinical and pathological studies of brain diseases.
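A minimal sketch of an SC-ICA-style pipeline as described above (cluster on spectral angle, then run ICA per cluster and feed the components to an SVM); the cluster count, the unit-norm approximation of spectral-angle distance, and the helper names are assumptions for illustration, not the authors' settings:

    # Hedged sketch: spectral-angle clustering, per-cluster ICA,
    # then SVM classification of the independent components.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import FastICA
    from sklearn.svm import SVC

    def spectral_angle_features(X):
        """Normalize each multispectral pixel vector to unit length so that
        Euclidean clustering approximates spectral-angle clustering."""
        return X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)

    def sc_ica_features(X, n_clusters=3, n_components=3):
        """X: (n_pixels, n_bands) multispectral data. Returns ICA features
        computed separately within each spectral-angle cluster."""
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(
            spectral_angle_features(X))
        feats = np.zeros((X.shape[0], n_components))
        for c in range(n_clusters):
            idx = labels == c
            if idx.sum() > n_components:           # ICA needs enough samples
                feats[idx] = FastICA(n_components=n_components).fit_transform(X[idx])
        return feats

    # Usage sketch: train an SVM on the per-cluster independent components.
    # X_train: (n_pixels, n_bands); y_train: tissue label per pixel.
    # clf = SVC(kernel="rbf").fit(sc_ica_features(X_train), y_train)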

Abstract:

In this paper, we propose a multispectral analysis system using wavelet-based Principal Component Analysis (PCA) to improve brain tissue classification from MRI images. Global transforms like PCA often neglect significant small abnormality details when dealing with a massive amount of multispectral data. In order to resolve this issue, the input dataset is expanded with detail coefficients from a multisignal wavelet analysis. PCA is then applied to the new dataset to perform feature analysis. Finally, an unsupervised classification with the Fuzzy C-Means clustering algorithm is used to measure the improvement in reproducibility and accuracy of the results. A detailed comparative analysis of the classified tissues against those from conventional PCA is also carried out. The proposed method yielded a good improvement in the classification of small abnormalities, with high sensitivity/accuracy values of 98.9/98.3 in clinical analysis. Experimental results from synthetic and clinical data recommend the new method as a promising approach to brain tissue analysis.
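A hedged sketch of the feature-expansion step described above (the wavelet, decomposition level, upsampling choice and component count are illustrative assumptions): detail coefficients from a wavelet decomposition of each band are appended to the multispectral bands before PCA.

    # Hedged sketch: expand multispectral MRI pixels with wavelet detail
    # coefficients, then run PCA on the expanded dataset. The returned
    # features would then be clustered (e.g. with Fuzzy C-Means).
    import numpy as np
    import pywt
    from sklearn.decomposition import PCA

    def wavelet_expanded_pca(X, wavelet="db4", level=1, n_components=3):
        """X: (n_pixels, n_bands). Each band (column) is treated as a signal;
        its finest detail coefficients are stretched back to pixel length
        and appended as extra feature columns before PCA."""
        details = []
        for band in X.T:
            coeffs = pywt.wavedec(band, wavelet, level=level)
            d = coeffs[-1]                          # finest detail coefficients
            # stretch the detail vector to the original length (illustrative)
            d_full = np.interp(np.linspace(0, 1, len(band)),
                               np.linspace(0, 1, len(d)), d)
            details.append(d_full)
        X_expanded = np.hstack([X, np.array(details).T])
        return PCA(n_components=n_components).fit_transform(X_expanded)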

Abstract:

Multispectral analysis is a promising approach to tissue classification and abnormality detection from Magnetic Resonance (MR) images, but instability in the accuracy and reproducibility of the classification results from conventional techniques keeps it far from clinical application. Recent studies have proposed Independent Component Analysis (ICA) as an effective method for separating source signals from multispectral MR data. However, it often fails to extract local features such as small abnormalities, especially from dependent real data. A multisignal wavelet analysis prior to ICA is proposed in this work to resolve these issues. The best de-correlated detail coefficients are combined with the input images to give better classification results. The performance improvement of the proposed method over conventional ICA is demonstrated by segmentation and classification using k-means clustering. Experimental results from synthetic and real data strongly confirm the positive effect of the new method, with improved Tanimoto index/sensitivity values of 0.884/93.605 for reproduced small white matter lesions.
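Since this and the preceding abstracts report Tanimoto index values for reproduced lesions, here is a small sketch of that overlap measure on binary lesion masks (this is the common Tanimoto/Jaccard definition; its use in these works is an assumption):

    # Hedged sketch: Tanimoto (Jaccard) index between a segmented lesion
    # mask and a reference mask, as commonly used to score reproduced lesions.
    import numpy as np

    def tanimoto_index(pred, ref):
        """pred, ref: boolean arrays of the same shape.
        Returns |pred & ref| / |pred | ref| (1.0 for two empty masks)."""
        pred, ref = np.asarray(pred, bool), np.asarray(ref, bool)
        union = np.logical_or(pred, ref).sum()
        if union == 0:
            return 1.0
        return np.logical_and(pred, ref).sum() / union

    print(tanimoto_index([1, 1, 0, 0], [1, 0, 1, 0]))  # -> 0.333...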

Abstract:

Speech is the primary, most prominent and most convenient means of communication in audible language. Through speech, people can express their thoughts, feelings and perceptions by the articulation of words. Human speech is a complex signal which is non-stationary in nature. It carries immensely rich information about the words spoken, as well as the speaker's accent, attitude, expression, intention, sex, emotion and style. The main objective of Automatic Speech Recognition (ASR) is to identify whatever people speak by means of computer algorithms, enabling people to communicate with a computer in a natural spoken language. Automatic recognition of speech by machines has been one of the most exciting, significant and challenging areas of research in the field of signal processing over the past five to six decades. Despite the developments and intensive research done in this area, the performance of ASR is still lower than that of speech recognition by humans and has yet to achieve a completely reliable performance level. The main objective of this thesis is to develop an efficient speech recognition system for recognizing speaker-independent isolated words in Malayalam.

Abstract:

Thermoactive building systems are building components that form part of the room-enclosing surfaces and can be supplied with a heating or cooling medium through an integrated pipe system, thereby enabling the heating or cooling of the room. Under this definition, the variety of constructions ranges from heated or chilled ceilings, through floor slabs with pipes integrated in their core, to underfloor heating systems. The extremely slow-reacting systems among these are deliberately employed to decouple the energy supply from the room's energy demand in time, in the interest of rational energy use, e.g. active cooling of the building component at night and passive cooling of the room via the cool component during the day. Building and plant concepts that include slow-reacting thermoactive building systems require, in a competent and responsible planning process, the use of modern building simulation tools in order to make well-founded statements about comfort and energy demand. Within these tools, the thermoactive building systems are represented by calculation components that are based on mathematical-physical models and serve to solve the multi-dimensional transient heat conduction problem inherent in the component. Until now, two fundamentally different approaches to this solution were available; both stem from physical modeling and impose limits on the geometry that can be represented or on the computation speed. The present work documents a new approach, referred to as experimental modeling. By way of system identification, the parameters of a compact black-box model can be determined from experimentally obtained data series; this model reproduces the input-output behavior of the corresponding thermoactive building component, of arbitrary construction, with sufficient accuracy. The measurement data series can be generated by highly accurate calculations which, because of their level of detail, would be unsuitable for direct use in building simulation. The application of system identification to the two-dimensional heat conduction problem, and the proof of its suitability, is carried out on six very different constructions of thermoactive building systems and confirms very small temperature and energy balance errors. Comparisons between black-box models determined via system identification and physical models for two floor constructions show that the former can also be used as a reference for accuracy assessments. The practicality of the new modeling approach is demonstrated in case studies involving year-round simulations of an exemplary office room under variations of the building component and its operation. For this purpose, the black-box model is integrated into the commercial building and plant simulation program CARNOT. The acceptable computation times for a single-zone building model, combined with the high accuracies, attest to the suitability of the new modeling approach.
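A minimal sketch of the system-identification idea described above: fitting a discrete-time black-box (ARX-type) model of a component's input-output behavior from data series by least squares. The model order and variable names are illustrative assumptions, not the model structure used in the thesis.

    # Hedged sketch: identify a black-box ARX model
    #   y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2]
    # from recorded input/output series (e.g. supply temperature -> heat flux).
    import numpy as np

    def fit_arx(u, y, na=2, nb=2):
        """Least-squares fit of ARX parameters from data series u, y."""
        n = max(na, nb)
        rows, targets = [], []
        for k in range(n, len(y)):
            past_y = [y[k - i] for i in range(1, na + 1)]
            past_u = [u[k - i] for i in range(1, nb + 1)]
            rows.append(past_y + past_u)
            targets.append(y[k])
        theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
        return theta  # [a1..a_na, b1..b_nb]

    def simulate_arx(theta, u, y0, na=2, nb=2):
        """Run the identified model forward from initial outputs y0
        (len(y0) must be at least max(na, nb))."""
        y = list(y0)
        for k in range(len(y0), len(u)):
            past = [y[k - i] for i in range(1, na + 1)] + \
                   [u[k - i] for i in range(1, nb + 1)]
            y.append(float(np.dot(theta, past)))
        return np.array(y)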

Abstract:

Because of its low energy demand and potentially good expenditure figures, an earth-to-air heat exchanger (L-EWT) is a candidate environmentally friendly supply component for buildings. It is particularly advantageous that an L-EWT can preheat or cool the ambient air depending on the season. Accordingly, L-EWTs are of interest for energy saving not only in residential construction but also wherever large amounts of fossil energy are still needed for space cooling, in the office and production building sector. The application range of an L-EWT spans volume flows from 100 m3/h to several 100,000 m3/h. This range, together with the transient boundary conditions, makes it very difficult to draw generally valid conclusions about the expected thermal system behavior from the multitude of possible construction variants. The main goal of this work is to develop characteristic figures, on the basis of extensive multi-year measurements on a purpose-built test facility and a specially adapted numerical model, that make it possible to determine the operating characteristics of an L-EWT in everyday planning practice and to identify a technically, ecologically and economically efficient system. The characteristic figures defined are elewt (expenditure figure), QV (net volumetric output), ME (yield per meter), and the combination of v (flow velocity) and VL (volume flow per meter); these lead to important information with which the quality of system variants can be assessed in the planning phase. Further findings on the more precise estimation of soil parameters are presented. The hygienic situation of the air transported through the L-EWT is described for the warm season, when condensation occurs; for this reason, all relevant air-hygiene parameters were recorded in several elaborate measurement campaigns and checked for pathogenic effects. Sensitivity analyses show which errors occur when incorrect boundary conditions are assumed. Furthermore, this work presents essential fundamental findings, obtained from operational observation and from the evaluation of the extensive measurement data available from several installations, that are significant for practical implementation and for plant operation. Detailed notes on material properties and on the economic efficiency of the system are included.
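As a hedged illustration of how such characteristic figures might be evaluated from monitoring data (the formulas below are plausible readings of the named figures, assumed for illustration; the thesis's exact definitions may differ):

    # Hedged sketch: plausible evaluations of L-EWT characteristic figures
    # from monitoring data. Definitions are assumptions for illustration.
    def expenditure_figure(fan_energy_kwh, thermal_energy_kwh):
        """elewt: auxiliary (fan) energy spent per unit of thermal energy
        delivered; smaller is better."""
        return fan_energy_kwh / thermal_energy_kwh

    def meter_yield(thermal_energy_kwh, pipe_length_m):
        """ME: thermal energy delivered per meter of buried pipe."""
        return thermal_energy_kwh / pipe_length_m

    def meter_volume_flow(volume_flow_m3h, pipe_length_m):
        """VL: volume flow handled per meter of pipe."""
        return volume_flow_m3h / pipe_length_m

    # Example: 120 kWh of fan energy moving 2400 kWh of cooling via 50 m of pipe
    print(expenditure_figure(120, 2400))   # -> 0.05
    print(meter_yield(2400, 50))           # -> 48.0 (kWh per meter)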

Abstract:

Globalization is widely regarded as the rise of the borderless world. In practice, however, true globalization points rather to a "spatial logic" by which globalization is manifested locally in the shape of insular space. Globalization in this sense is not merely about the creation of physical fragmentation of space but also the creation of social disintegration. This study tries to prove that global processes also create various forms of insular space, leading in turn to specific social implications. In order to examine the problem, this study looks at two cases: China's Pearl River Delta (PRD) and Jakarta in Indonesia. The PRD case reveals three forms of insular space, namely the modular, the concealed and the hierarchical. The modular points to the form of enclosed factories in which workers are vulnerable to human-rights violations due to the absence of public control. The concealed refers to the production of insular space by subtle discrimination against certain social groups in urban space. And the hierarchical points to a production of insular space that is formed by an imbalanced population flow. The Jakarta case attempts to show further types of insularity related to the complexity of a mega-city shaped by a culture of exclusion: dormant and hollow insularity. The dormant refers to the genesis of an insular, radical community out of a culture of resistance. The last type, the hollow, points to the process of making a "pseudo community" in which a sense of community never really develops and social relationships with the surroundings remain weak. Although global processes create various expressions of territorial insularization, this study finds that the "line of flight" is always present, where the border of insularity is crossed. The PRD produces a vernacular modernization, carried out by peasants, that is less likely to be controlled by the politics of insularization. In Jakarta, the culture of insularization causes urban informalities that have no space, neither spatially nor socially; hence their state of ephemerality continues as a tactic of place-making. This study argues that these crossings possess the potential to become a reconciling venue that defuses the power of insularity.

Abstract:

Genetic programming is known to provide good solutions for many problems, such as the evolution of network protocols and distributed algorithms. In such cases it is most often a hardwired module of a design framework that assists the engineer in optimizing specific aspects of the system to be developed, providing its results in a fixed format through an internal interface. In this paper we show how the utility of genetic programming can be increased remarkably by isolating it as a component and integrating it into the model-driven software development process. Our genetic programming framework produces XMI-encoded UML models that can easily be loaded into widely available modeling tools, which in turn possess code generation as well as additional analysis and test capabilities. We use the evolution of a distributed election algorithm as an example to illustrate how genetic programming can be combined with model-driven development. This example clearly illustrates the advantages of our approach: the generation of source code in different programming languages.
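A minimal sketch of the generational loop at the core of such a genetic programming component (the representation, variation operators and fitness function are placeholders; the paper's framework additionally serializes the winning individual as an XMI-encoded UML model rather than returning it directly):

    # Hedged sketch of a generic genetic-programming loop. The fitness
    # function, variation operators and individual representation are
    # placeholders for whatever the framework plugs in.
    import random

    def evolve(init, fitness, crossover, mutate, pop_size=50, generations=100):
        """Generic GP loop: elitist selection, crossover, mutation.
        Returns the best individual found."""
        population = [init() for _ in range(pop_size)]
        for _ in range(generations):
            scored = sorted(population, key=fitness, reverse=True)
            elite = scored[: pop_size // 5]            # keep the top 20%
            children = []
            while len(children) < pop_size - len(elite):
                a, b = random.sample(elite, 2)         # parents from the elite
                children.append(mutate(crossover(a, b)))
            population = elite + children
        return max(population, key=fitness)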

Abstract:

Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land-use patterns. An essential methodology for studying and quantifying such interactions is the adoption of land-use models. Through the application of land-use models it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of the driving forces. Modeling land use and land-use change has a long tradition; on the regional scale in particular, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation. These features enable the efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grids, grid cells, attributes etc.) and takes over responsibility for their administration. By means of a scripting language (Python), extended with language features specific to land-use modeling, these data structures can be accessed and manipulated by modeling applications. The scripting-language interpreter is embedded in SITE. The integration of sub-models can be achieved via the scripting language or through a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was laid on expandability, maintainability and usability. Along with the modeling framework, a land-use model for analyzing the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, the socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics over the historical period from 1981 to 2002. Analogously, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace-gas emissions, the DAYCENT agro-ecosystem model was integrated. This case study showed that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason the situation had to be judged unsustainable, even though the increased agricultural use implied economic improvements and higher farmers' incomes. Because of the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component.
The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map-comparison algorithm capable of comparing a simulation result to a reference map. Several map-optimization and map-comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map-comparison measure as the objective function. The calibration period ranged from 1981 to 2002, and the respective reference land-use maps were compiled for this period. It could be shown that efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge of the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and the resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task an empirical model was integrated that describes the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
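A hedged sketch of the calibration objective described above: a figure-of-merit map comparison in the usual hits/misses/false-alarms formulation (its exact form in SITE is an assumption), which a genetic algorithm would maximize over the selected model parameters.

    # Hedged sketch: figure-of-merit map comparison for land-use-change
    # calibration, following the common hits/misses/false-alarms definition.
    import numpy as np

    def figure_of_merit(initial, reference, simulated):
        """All arguments are integer land-use maps of equal shape.
        hits: observed change simulated correctly;
        misses: observed change simulated as persistence;
        false alarms: observed persistence simulated as change;
        wrong change: observed change simulated as the wrong category."""
        obs_change = reference != initial
        sim_change = simulated != initial
        hits = np.sum(obs_change & sim_change & (simulated == reference))
        misses = np.sum(obs_change & ~sim_change)
        false_alarms = np.sum(~obs_change & sim_change)
        wrong_change = np.sum(obs_change & sim_change & (simulated != reference))
        denom = hits + misses + false_alarms + wrong_change
        return hits / denom if denom else 1.0

    # A genetic algorithm would run the land-use model with candidate
    # parameter sets and use figure_of_merit(...) as the objective to maximize.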