909 results for Concept-based Terminology


Relevance:

30.00%

Publisher:

Abstract:

Brazil is expected to have 19.6 million patients with diabetes by the year 2030. A key concept in the treatment of type 2 diabetes mellitus (T2DM) is establishing individualized glycemic goals based on each patient's clinical characteristics, which impact the choice of antihyperglycemic therapy. Targets for glycemic control, including fasting blood glucose, postprandial blood glucose, and glycated hemoglobin (A1C), are often not reached with antihyperglycemic therapy alone, and insulin therapy is often required. Basal insulin is considered an initial strategy; however, premixed insulins are convenient and equally or more effective, especially for patients who require both basal and prandial control but prefer a simpler strategy involving fewer daily injections than a basal-bolus regimen. Most physicians are reluctant to transition patients to insulin treatment because of inappropriate assumptions and insufficient information. We conducted a nonsystematic review in PubMed and identified the most relevant and recently published articles comparing premixed insulin with basal insulin analogues used alone or in combination with rapid-acting insulin analogues before meals in patients with T2DM. These studies suggest that premixed insulin analogues are equally or more effective in reducing A1C than basal insulin analogues alone, despite a small increase in the risk of nonsevere hypoglycemic events and clinically nonsignificant weight gain. Premixed insulin analogues can be used in insulin-naïve patients, in patients already on basal insulin therapy, and in patients on basal-bolus therapy who are noncompliant with blood glucose self-monitoring and titration of multiple insulin doses. We additionally discuss practical aspects of titration for the premixed insulin analogue formulations commercially available in Brazil.

Relevance:

30.00%

Publisher:

Abstract:

At present, solid thin films benefit from a well-established, mature processing technology able to produce components that, depending on their main characteristics, can perform either passive or active functions. Additionally, Si-based materials in the form of thin films perfectly match the concept of miniaturized, low-consumption devices, as required in various modern technological applications. These aspects motivated the present work, which studied optical microcavities based entirely on silicon and silicon nitride thin films. The structures were prepared by sputtering deposition, which, owing to the adopted conditions (atmosphere and deposition rate) and the arrangement of layers, yielded cavities operating either in the visible (at ~670 nm) or in the near-infrared (at ~1560 nm) wavelength range. The main novelty of the work lies in the construction of optical microcavities with a reduced number of periods whose main properties can be tuned by thermal annealing treatments. The work also discusses the angle-dependent behavior of the optical transmission profiles, as well as the use of the COMSOL software package to simulate the microcavities.
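Although the thesis models its cavities in COMSOL, the resonance behavior of such a layered structure can be previewed with a one-dimensional transfer-matrix calculation. The sketch below is only an illustration under assumed refractive indices for the silicon and silicon-nitride layers (the actual film properties depend on the sputtering conditions): it wraps a half-wave spacer between two few-period Bragg mirrors and scans the transmission around the 1560 nm design wavelength.

```python
# Minimal transfer-matrix sketch of a thin-film optical microcavity
# (illustrative indices for a-Si and SiNx; not the thesis' COMSOL model).
import numpy as np

def layer_matrix(n, d, lam):
    """Characteristic matrix of a homogeneous layer at normal incidence."""
    delta = 2 * np.pi * n * d / lam            # phase thickness of the layer
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmission(stack, lam, n_in=1.0, n_sub=1.45):
    """Power transmission of a layer stack between air and a silica substrate."""
    M = np.eye(2, dtype=complex)
    for n, d in stack:
        M = M @ layer_matrix(n, d, lam)
    denom = n_in * M[0, 0] + n_in * n_sub * M[0, 1] + M[1, 0] + n_sub * M[1, 1]
    t = 2 * n_in / denom
    return (n_sub / n_in) * abs(t) ** 2

lam0 = 1560e-9                                 # target resonance (near-IR design)
nH, nL = 3.4, 2.0                              # assumed a-Si / SiNx indices
qw = lambda n: lam0 / (4 * n)                  # quarter-wave thickness
mirror = [(nH, qw(nH)), (nL, qw(nL))] * 4      # reduced number of periods
cavity = [(nL, lam0 / (2 * nL))]               # half-wave spacer
stack = mirror + cavity + mirror[::-1]

for lam in np.linspace(1500e-9, 1620e-9, 7):   # includes the 1560 nm resonance
    print(f"{lam * 1e9:7.1f} nm  T = {transmission(stack, lam):.3f}")
```

At the design wavelength the half-wave spacer restores high transmission inside the mirrors' stop band, which is the defect-mode behavior the abstract describes.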

Relevance:

30.00%

Publisher:

Abstract:

Electronic business surely represents the new development perspective for world-wide trade. Together with the idea of e-business, and the exigency to exchange business messages between trading partners, the concept of business-to-business (B2B) integration arose. B2B integration is becoming necessary to allow partners to communicate and exchange business documents, like catalogues, purchase orders, reports and invoices, overcoming architectural, applicative, and semantic differences, according to the business processes implemented by each enterprise. Business relationships can be very heterogeneous, and consequently there are various ways to integrate enterprises with each other. Moreover, nowadays not only large enterprises but also small and medium enterprises are moving towards e-business: more than two-thirds of Small and Medium Enterprises (SMEs) use the Internet as a business tool. One of the business areas actively facing the interoperability problem is supply chain management. In order to really allow SMEs to improve their business and to fully exploit ICT in their business transactions, three main players must be considered and joined: the new emerging ICT technologies, the scenario and requirements of the enterprises, and the world of standards and standardisation bodies. This thesis presents the definition and development of an interoperability framework (and the related standardisation initiatives) to provide the Textile/Clothing sector with a shared set of business documents and protocols for electronic transactions. Considering also some limitations, the thesis proposes an ontology-based approach to improve the functionalities of the developed framework and, exploiting semantic web technologies, to improve the standardisation life-cycle, intended as the development, dissemination and adoption of B2B protocols for a specific business domain. The use of ontologies allows the semantic modelling of knowledge domains, upon which it is possible to develop a set of components for better management of B2B protocols and to ease their comprehension and adoption by the target users.
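To make the ontology-based approach concrete, here is a minimal sketch, using rdflib, of how a shared set of business-document types for the Textile/Clothing domain could be expressed as a small RDFS vocabulary. The namespace, class and property names are hypothetical illustrations, not the framework's actual schema.

```python
# A minimal sketch (invented vocabulary, not the thesis' real B2B schema) of
# modelling Textile/Clothing business documents as an RDFS ontology.
from rdflib import Graph, Namespace, RDF, RDFS, Literal

TC = Namespace("http://example.org/textile-clothing#")   # hypothetical namespace
g = Graph()
g.bind("tc", TC)

# Class hierarchy: specific documents are subclasses of a generic BusinessDocument.
g.add((TC.BusinessDocument, RDF.type, RDFS.Class))
g.add((TC.ProtocolStep, RDF.type, RDFS.Class))
for doc in ("Catalogue", "PurchaseOrder", "Invoice", "DispatchAdvice"):
    g.add((TC[doc], RDF.type, RDFS.Class))
    g.add((TC[doc], RDFS.subClassOf, TC.BusinessDocument))

# A protocol step is linked to the document it exchanges.
g.add((TC.exchanges, RDF.type, RDF.Property))
g.add((TC.exchanges, RDFS.domain, TC.ProtocolStep))
g.add((TC.exchanges, RDFS.range, TC.BusinessDocument))
g.add((TC.PurchaseOrder, RDFS.comment,
       Literal("Buyer-to-supplier order within the supply chain protocol.")))

print(g.serialize(format="turtle"))
```

A shared machine-readable vocabulary of this kind is what lets heterogeneous partners agree on document semantics independently of their internal systems.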

Relevance:

30.00%

Publisher:

Abstract:

Linguistic immersion programs are increasing nowadays. The concept of being bilingual, which began to be used by schools for the elite in the 19th century and became an educational option in the mid-20th century, has given rise to CLIL (Content and Language Integrated Learning), a methodology through which students work in a bilingual environment, acquiring knowledge of curricular subjects while developing their competences in a foreign language. In this teaching context, a new European project called PlayingCLIL was started. Six partners from different European countries (Germany, the United Kingdom, Spain and Romania) are working on this project. Our main aim is to develop a new methodology for learning a foreign language that combines elements of pedagogic drama (interactive games) with the CLIL classroom. At present we are testing the games in different schools and high schools, and we are compiling the results into a handbook (printed and e-book).

Relevance:

30.00%

Publisher:

Abstract:

A prevalent claim is that we are in a knowledge economy. When we talk about the knowledge economy, we generally mean the concept of a "knowledge-based economy", indicating the use of knowledge and technologies to produce economic benefits. Knowledge is thus both the tool and the raw material (people's skills) for producing some kind of product or service. In this kind of environment, economic organization is undergoing several changes: authority relations are less important, legal and ownership-based definitions of the boundaries of the firm are becoming irrelevant, and there are only few constraints on the set of coordination mechanisms. Hence what characterises a knowledge economy is the growing importance of human capital in productive processes (Foss, 2005) and the increasing knowledge intensity of jobs (Hodgson, 1999). Economic processes are also highly intertwined with social processes: they are likely to be informal and reciprocal rather than formal and negotiated. Another important point is the division of labor: as economic activity becomes mainly intellectual and requires the integration of specific and idiosyncratic skills, the task of dividing the job and assigning it to the most appropriate individuals becomes arduous, a "supervisory problem" (Hodgson, 1999) emerges, and traditional hierarchical control may prove increasingly ineffective. Not only does the specificity of know-how make it awkward to monitor the execution of tasks; more importantly, top-down integration of skills may be difficult because 'the nominal supervisors will not know the best way of doing the job – or even the precise purpose of the specialist job itself – and the worker will know better' (Hodgson, 1999). We therefore expect the organization of the economic activity of specialists to be, at least partially, self-organized. The aim of this thesis is to bridge studies from computer science, and in particular from Peer-to-Peer (P2P) networks, to organization theories. We think the P2P paradigm fits well with organization problems in all those situations in which a central authority is not possible. We believe that P2P networks show a number of characteristics similar to firms working in a knowledge-based economy, and hence that the methodology used for studying P2P networks can be applied to organization studies. P2P networks have three main characteristics in common with firms involved in the knowledge economy:
- Decentralization: in a pure P2P system every peer is an equal participant; there is no central authority governing the actions of the single peers.
- Cost of ownership: P2P computing implies shared ownership, reducing the cost of owning the systems and the content, and the cost of maintaining them.
- Self-organization: the process in a system leading to the emergence of global order within the system without the presence of another system dictating this order.
These characteristics are also present in the kind of firm we address, which is why we have transferred the techniques we adopted in computer science studies (Marcozzi et al., 2005; Hales et al., 2007) to management science.
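The compare-and-copy dynamics behind such self-organizing overlays can be conveyed with a toy simulation. The sketch below is a loose, SLAC-style caricature (all parameters, payoffs and the mutation rate are invented for illustration, not taken from the cited papers): each peer periodically compares its utility with a random peer and, if worse off, copies that peer's strategy and links, with a little mutation keeping the population exploring.

```python
# Toy copy-and-rewire sketch of self-organizing cooperation in a P2P overlay
# (illustrative parameters; not the protocol from Hales et al.).
import random

N, ROUNDS, MUT = 100, 50, 0.05
nodes = [{"coop": random.random() < 0.5, "links": set(), "util": 0.0}
         for _ in range(N)]
for i in range(N):                          # random initial overlay, degree 5
    nodes[i]["links"] = set(random.sample([j for j in range(N) if j != i], 5))

for _ in range(ROUNDS):
    # Play: cooperators pay a cost to benefit their neighbours.
    for n in nodes:
        n["util"] = 0.0
    for n in nodes:
        if n["coop"]:
            for j in n["links"]:
                nodes[j]["util"] += 1.0
            n["util"] -= 0.5 * len(n["links"])
    # Compare-and-copy: adopt the strategy and links of a fitter random peer.
    for i in range(N):
        j = random.randrange(N)
        if j != i and nodes[j]["util"] > nodes[i]["util"]:
            nodes[i]["coop"] = nodes[j]["coop"]
            nodes[i]["links"] = set(nodes[j]["links"]) | {j}
            nodes[i]["links"].discard(i)     # no self-links
        if random.random() < MUT:            # mutation keeps exploring
            nodes[i]["coop"] = not nodes[i]["coop"]

print("cooperators:", sum(n["coop"] for n in nodes), "/", N)
```

No node directs the process: global order (clusters of cooperating peers) emerges from purely local comparisons, which is exactly the organizational analogy the thesis draws.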

Relevance:

30.00%

Publisher:

Abstract:

Background and aims: Sorafenib is the reference therapy for advanced hepatocellular carcinoma (HCC). No method exists to predict individual response in the very early treatment period. Starting from the clinical experience in humans that subcutaneous metastases may rapidly change consistency under sorafenib, and given that elastosonography, a new ultrasound-based technique, allows assessment of tissue stiffness, we investigated the role of elastosonography in the very early prediction of tumor response to sorafenib in an HCC animal model. Methods: HCC (Huh7 cells) subcutaneous xenografting in mice was utilized. Mice were randomized to vehicle or sorafenib treatment when tumor size was 5-10 mm. Elastosonography (MyLab 70XVG, Esaote, Genova, Italy) of the whole tumor mass on a sagittal plane with a 10 MHz linear transducer was performed at different time points from treatment start (day 0, +2, +4, +7 and +14) until mice were sacrificed (day +14), with the operator blind to treatment. In order to overcome variability in absolute elasticity measurements when assessing changes over time, values were expressed in arbitrary units as the relative stiffness of the tumor tissue in comparison to the stiffness of a standard reference stand-off pad lying on the skin over the tumor. Results: Sorafenib-treated mice showed a smaller tumor size increase at day +14 than vehicle-treated mice (tumor volume increase +192.76% vs +747.56%, p=0.06). Among sorafenib-treated tumors, 6 mice showed a better response to treatment than the other 4 (increase in volume +177% vs +553%, p=0.011). At day +2, median tumor elasticity increased in the sorafenib-treated group (+6.69%, range −30.17% to +58.51%) while it decreased in the vehicle group (−3.19%, range −53.32% to +37.94%), leading to a significant difference in absolute values (p=0.034). From this time point onward, elasticity decreased in both groups at similar speed over time and was no longer statistically different. Among sorafenib-treated mice, all 6 best responders at day +14 showed an increase in elasticity at day +2 (ranging from +3.30% to +58.51%) in comparison to baseline, whereas 3 of the 4 poorer responders showed a decrease. Interestingly, these 3 tumours showed higher elasticity values at day 0 than responder tumours. Conclusions: Elastosonography appears a promising non-invasive new technique for the early prediction of HCC tumor response to sorafenib. Indeed, we showed that responder tumours are characterized by an early increase in elasticity. The possibility of distinguishing a priori between responders and non-responders based on the higher baseline elasticity of the latter needs to be validated in ad-hoc experiments, and confirmation of our results in humans is warranted.
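The normalization described in the Methods reduces to a simple ratio and a percent change from baseline. A worked sketch with invented stiffness values (chosen to land near the reported +6.69% median, purely for illustration):

```python
# Worked sketch (illustrative numbers, not the study's raw data) of the
# normalization used: tumour stiffness expressed relative to a reference
# stand-off pad, then tracked as a percent change from baseline.
def relative_stiffness(tumour_kpa, pad_kpa):
    """Stiffness in arbitrary units, normalized to the stand-off pad."""
    return tumour_kpa / pad_kpa

def percent_change(day_n, day_0):
    return 100.0 * (day_n - day_0) / day_0

day0 = relative_stiffness(tumour_kpa=24.0, pad_kpa=60.0)   # hypothetical values
day2 = relative_stiffness(tumour_kpa=25.6, pad_kpa=60.0)
print(f"day 0: {day0:.3f} a.u., day +2: {day2:.3f} a.u., "
      f"change: {percent_change(day2, day0):+.2f}%")       # +6.67%: responder-like rise
```

Normalizing to the pad cancels session-to-session variability in the absolute elastogram, so only the relative change over time carries the signal.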

Relevance:

30.00%

Publisher:

Abstract:

The "sustainability" concept relates to the prolonging of human economic systems with as little detrimental impact on ecological systems as possible. Construction that exhibits good environmental stewardship and practices that conserve resources in a manner that allow growth and development to be sustained for the long-term without degrading the environment are indispensable in a developed society. Past, current and future advancements in asphalt as an environmentally sustainable paving material are especially important because the quantities of asphalt used annually in Europe as well as in the U.S. are large. The asphalt industry is still developing technological improvements that will reduce the environmental impact without affecting the final mechanical performance. Warm mix asphalt (WMA) is a type of asphalt mix requiring lower production temperatures compared to hot mix asphalt (HMA), while aiming to maintain the desired post construction properties of traditional HMA. Lowering the production temperature reduce the fuel usage and the production of emissions therefore and that improve conditions for workers and supports the sustainable development. Even the crumb-rubber modifier (CRM), with shredded automobile tires and used in the United States since the mid 1980s, has proven to be an environmentally friendly alternative to conventional asphalt pavement. Furthermore, the use of waste tires is not only relevant in an environmental aspect but also for the engineering properties of asphalt [Pennisi E., 1992]. This research project is aimed to demonstrate the dual value of these Asphalt Mixes in regards to the environmental and mechanical performance and to suggest a low environmental impact design procedure. In fact, the use of eco-friendly materials is the first phase towards an eco-compatible design but it cannot be the only step. The eco-compatible approach should be extended also to the design method and material characterization because only with these phases is it possible to exploit the maximum potential properties of the used materials. Appropriate asphalt concrete characterization is essential and vital for realistic performance prediction of asphalt concrete pavements. Volumetric (Mix design) and mechanical (Permanent deformation and Fatigue performance) properties are important factors to consider. Moreover, an advanced and efficient design method is necessary in order to correctly use the material. A design method such as a Mechanistic-Empirical approach, consisting of a structural model capable of predicting the state of stresses and strains within the pavement structure under the different traffic and environmental conditions, was the application of choice. In particular this study focus on the CalME and its Incremental-Recursive (I-R) procedure, based on damage models for fatigue and permanent shear strain related to the surface cracking and to the rutting respectively. It works in increments of time and, using the output from one increment, recursively, as input to the next increment, predicts the pavement conditions in terms of layer moduli, fatigue cracking, rutting and roughness. This software procedure was adopted in order to verify the mechanical properties of the study mixes and the reciprocal relationship between surface layer and pavement structure in terms of fatigue and permanent deformation with defined traffic and environmental conditions. The asphalt mixes studied were used in a pavement structure as surface layer of 60 mm thickness. 
The performance of the pavement was compared to the performance of the same pavement structure where different kinds of asphalt concrete were used as surface layer. In comparison to a conventional asphalt concrete, three eco-friendly materials, two warm mix asphalt and a rubberized asphalt concrete, were analyzed. The First Two Chapters summarize the necessary steps aimed to satisfy the sustainable pavement design procedure. In Chapter I the problem of asphalt pavement eco-compatible design was introduced. The low environmental impact materials such as the Warm Mix Asphalt and the Rubberized Asphalt Concrete were described in detail. In addition the value of a rational asphalt pavement design method was discussed. Chapter II underlines the importance of a deep laboratory characterization based on appropriate materials selection and performance evaluation. In Chapter III, CalME is introduced trough a specific explanation of the different equipped design approaches and specifically explaining the I-R procedure. In Chapter IV, the experimental program is presented with a explanation of test laboratory devices adopted. The Fatigue and Rutting performances of the study mixes are shown respectively in Chapter V and VI. Through these laboratory test data the CalME I-R models parameters for Master Curve, fatigue damage and permanent shear strain were evaluated. Lastly, in Chapter VII, the results of the asphalt pavement structures simulations with different surface layers were reported. For each pavement structure, the total surface cracking, the total rutting, the fatigue damage and the rutting depth in each bound layer were analyzed.
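A schematic of the incremental-recursive idea: each time increment consumes the pavement state left by the previous one. The damage laws and every constant below are placeholders invented for illustration; CalME's actual transfer functions are calibrated models and are not reproduced here.

```python
# Schematic incremental-recursive (I-R) loop in the spirit of CalME:
# the state output by one increment is fed back as input to the next.
# All laws and constants are illustrative placeholders, not CalME's models.
E0 = 6000.0                                 # intact surface-layer stiffness [MPa] (assumed)
damage, rut = 0.0, 0.0
for month in range(1, 241):                 # 20 years in monthly increments
    E = E0 * (1.0 - damage)                 # recursive input: damaged modulus
    strain = 200.0 * (E0 / E) ** 0.5        # proxy tensile strain [microstrain]
    traffic = 30000                         # ESALs per increment (assumed)
    d_damage = traffic * (strain / 200.0) ** 4 / 3e7   # placeholder fatigue law
    d_rut = traffic * (strain / 200.0) ** 2 / 4e5      # placeholder shear-strain law
    damage = min(0.99, damage + d_damage)   # cap keeps the modulus positive
    rut += d_rut
    if month % 60 == 0:
        print(f"year {month // 12:>2}: E = {E:6.0f} MPa, "
              f"damage = {damage:.3f}, rut = {rut:5.1f} mm")
```

The feedback is the essential point: as damage accumulates, the modulus drops, strains grow, and subsequent increments damage the layer faster, which is why the procedure must be run incrementally rather than evaluated once.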

Relevance:

30.00%

Publisher:

Abstract:

A series of oligo-phenylene dendronised conjugated polymers was prepared. The divergent synthetic approach adopted allowed the facile synthesis of a range of dendronised monomers from a common intermediate, e.g. first- and second-generation fluorene. Only the polymerisation of the first-generation and alkylarylamine-substituted dendronised fluorene monomers yielded high-molecular-weight materials, which is attributed to the low solubility of the remaining dendronised monomers. The alkylarylamine-substituted dendronised poly(fluorene) was incorporated into an organic light emitting diode (OLED) and exhibited increased colour stability in air compared to other poly(fluorenes). The concept of dendronisation was extended to poly(fluorenone), a previously insoluble material. The first soluble poly(fluorenone) was synthesised by incorporating oligo-phenylene dendrons at the 4-position of fluorenone. The dendronisation of fluorenone allowed a polymer with an Mn of 4.1 × 10⁴ g mol⁻¹ to be prepared. Cyclic voltammetry of the dendronised poly(fluorenone) showed that the electron affinity of the polymer is high and that the polymer is a promising n-type material. A dimer and trimer of indenofluorene (IF) were prepared from the monobromo IF. These oligomers were investigated by two-dimensional wide-angle X-ray scattering (2D-WAXS), polarised optical microscopy (POM) and dielectric spectroscopy, and found to form highly ordered smectic phases. By attaching a perylene dye as the end-capper on the IF oligomers, molecules exhibiting efficient Förster energy transfer were obtained. Indenofluorene monoketone, a potential defect structure in IF-based OLEDs, was synthesised. The synthesis of this model defect structure allowed the long-wavelength emission in OLEDs to be identified as arising from ketone defects. The long-wavelength emission from the indenofluorene monoketone was found to be concentration dependent, suggesting that aggregate formation is occurring. An IF-linked hexa-peri-hexabenzocoronene (HBC) dimer was synthesised. The 2D-WAXS images of this HBC dimer demonstrate that the molecule exhibits intercolumnar organisation perpendicular to the extrusion direction. POM images of mixtures of the HBC dimer with an HBC of low isotropisation temperature demonstrated that the HBC dimer mixes with the isotropic HBC.

Relevance:

30.00%

Publisher:

Abstract:

The research project we present arises from the virtuous combination of theory and teaching practice in the spirit of action research. The aim of this work is to develop a training course in specialized translation in the medical-scientific, technical, and economic-legal domains for the Spanish-Italian language pair, within the concrete institutional framework of the Italian university today. Our training proposal rests on three elements: a survey of the current translation market for this language pair, the identification of learning objectives based on the chosen model of translation competence, and the design of a competence-based syllabus built on the task-based approach (enfoque por tareas) to translation. Two aspects define the proposed course design: the concept of the specialized text genre for translation, and information management through new technologies (corpora, terminological and phraseological databases, translation memories, controlled translation). The work is organized in two parts: the first part (four chapters) presents the theoretical framework within which the reflection on teaching specialized translation develops; the second part (two chapters) presents the methodological and analytical framework within which our teaching proposal is elaborated. Chapter one illustrates the relationship between translation and the professional world; chapter two presents the concept of translation competence as a bridge between training and the world of professional translation; chapter three retraces the main stages in the evolution of the teaching of general translation; chapter four illustrates some of the most recent and comprehensive teaching proposals for specialized translation in the technical, medical-scientific, and economic-legal domains. Chapter five introduces the concept of the specialized text genre for translation, and chapter six illustrates the teaching proposal for specialized translation from Spanish into Italian that motivated this work.

Relevance:

30.00%

Publisher:

Abstract:

The concept of competitiveness, long considered strictly connected to economic and financial performance, has evolved, above all in recent years, toward new, wider interpretations disclosing its multidimensional nature. The shift to a multidimensional view of the phenomenon has sparked an intense debate, involving theoretical reflections on its characterizing features as well as methodological considerations on its assessment and measurement. The present research has a twofold objective: to study in depth the tangible and intangible aspects characterizing multidimensional competitive phenomena from a micro-level point of view, and to measure competitiveness through a model-based approach. Specifically, we propose a non-parametric approach to Structural Equation Model techniques for the computation of multidimensional composite measures. Structural Equation Model tools are used to develop the empirical application to the Italian case: a model-based micro-level competitiveness indicator is constructed for the measurement of the phenomenon on a large sample of Italian small and medium enterprises.
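The core of a model-based composite measure can be conveyed by a much simpler stand-in: standardize the manifest indicators of a latent dimension and weight them by their loadings on the first principal component. The sketch below uses fabricated data and is only a caricature of the thesis' non-parametric SEM approach, not its actual estimator.

```python
# Minimal sketch of a model-based composite indicator: standardized manifest
# indicators weighted by first-principal-component loadings (fabricated data;
# a simplified stand-in for the thesis' non-parametric SEM approach).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                # 200 firms x 4 indicators (fabricated)
X[:, 1] += 0.8 * X[:, 0]                     # induce some common variance

Z = (X - X.mean(axis=0)) / X.std(axis=0)     # standardize each indicator
cov = np.cov(Z, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
w = np.abs(eigvec[:, -1])                    # loadings on the first component
w /= w.sum()                                 # normalize into a weighting scheme
score = Z @ w                                # one competitiveness score per firm
print("weights:", np.round(w, 3),
      "| first five scores:", np.round(score[:5], 2))
```

Letting the data determine the weights, rather than fixing them a priori, is what makes the indicator "model-based"; the SEM machinery extends this idea to several interrelated latent dimensions at once.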

Relevance:

30.00%

Publisher:

Abstract:

Over the last 60 years, computers and software have enabled incredible advancements in every field. Nowadays, however, these systems are so complicated that it is difficult, if not impossible, to understand whether they meet some requirement or are able to show some desired behaviour or property. This dissertation introduces a Just-In-Time (JIT) a posteriori approach to performing the conformance check, identifying any deviation from the desired behaviour as soon as possible and possibly applying corrections. The declarative framework that implements our approach, entirely developed on the promising open-source forward-chaining Production Rule System (PRS) named Drools, consists of three components:
1. a monitoring module based on a novel, efficient implementation of Event Calculus (EC);
2. a general-purpose hybrid reasoning module (the first of its kind) merging temporal, semantic, fuzzy and rule-based reasoning;
3. a logic formalism based on the concept of expectations, introducing Event-Condition-Expectation rules (ECE-rules) to assess the global conformance of a system.
The framework is also accompanied by an optional module that provides Probabilistic Inductive Logic Programming (PILP). By shifting the conformance check from after execution to just in time, this approach combines the advantages of many a posteriori and a priori methods proposed in the literature. Quite remarkably, if corrective actions are explicitly given, the reactive nature of this methodology allows any deviation from the desired behaviour to be reconciled as soon as it is detected. In conclusion, the proposed methodology advances the solution of the conformance checking problem, helping to fill the gap between humans and increasingly complex technology.
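The monitoring idea can be miniaturized in plain Python (the actual framework is Drools-based; the fluents, events and expectation below are invented examples): events initiate and terminate fluents, and an expectation is checked against the state reconstructed from the event narrative.

```python
# Minimal Event Calculus sketch (plain Python, not the thesis' Drools engine):
# fluents are initiated/terminated by events; holds_at replays the narrative
# to decide whether a fluent holds at a given time.
INITIATES = {"login": ["session_open"],
             "pay":   ["order_paid"]}
TERMINATES = {"logout": ["session_open"]}

def holds_at(fluent, t, narrative):
    """narrative: time-ordered list of (timestamp, event) pairs."""
    state = False
    for ts, ev in sorted(narrative):
        if ts >= t:
            break
        if fluent in INITIATES.get(ev, []):
            state = True
        if fluent in TERMINATES.get(ev, []):
            state = False
    return state

log = [(1, "login"), (3, "pay"), (7, "logout")]
print(holds_at("session_open", 5, log))   # True
print(holds_at("session_open", 8, log))   # False

# An Event-Condition-Expectation check in the same spirit: every payment is
# expected to happen inside an open session.
violations = [(ts, ev) for ts, ev in log
              if ev == "pay" and not holds_at("session_open", ts, log)]
print("expectation violations:", violations)
```

Running this check as each event arrives, rather than over a completed log, is the "just in time" shift the dissertation advocates: violations surface while corrective actions can still be applied.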

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a universal model of documents and deltas. The model formalizes what it means to find differences between documents and provides a single shared formalization that any algorithm can use to describe the differences found between any kind of comparable documents. The main scientific contribution of this thesis is a universal delta model that can be used to represent the changes found by an algorithm. The main parts of this model are the formal definitions of changes (the pieces of information that record that something has changed), operations (the definitions of the kinds of change that happened) and deltas (coherent summaries of what has changed between two documents). The fundamental mechanism that makes the universal delta model a very expressive tool is the use of encapsulation relations between changes. In the universal delta model, changes are not always simple records of what has changed; they can also be combined into more complex changes that reflect the detection of more meaningful modifications. In addition to the main entities (i.e., changes, operations and deltas), the model also describes and defines documents and the concept of equivalence between documents. As a corollary to the model, there is also an extensible catalog of possible operations that algorithms can detect, used to create a common library of operations, and a UML serialization of the model, useful as a reference when implementing APIs that deal with deltas. The universal delta model presented in this thesis acts as the formal groundwork upon which algorithms can be based and libraries can be implemented. It removes the need to recreate a new delta model and terminology whenever a new algorithm is devised. It also alleviates the problems that toolmakers face when adapting their software to new diff algorithms.
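A minimal sketch of how the three entities and the encapsulation relation could look in code. The names, operation labels and locator syntax are invented for illustration; the thesis defines these formally, with a UML serialization as the reference.

```python
# Minimal sketch (invented names, not the thesis' exact formalization) of the
# core entities: operations classify a change, changes may encapsulate
# finer-grained changes, and a delta summarizes what changed between documents.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Change:
    operation: str                    # e.g. "insert", "delete", "move"
    target: str                       # locator of the affected content
    detail: str = ""
    parts: List["Change"] = field(default_factory=list)  # encapsulated changes

@dataclass
class Delta:
    source: str
    result: str
    changes: List[Change]

# A "move" detected as a more meaningful combination of a delete and an insert:
move = Change("move", "/body/p[3]", "paragraph relocated",
              parts=[Change("delete", "/body/p[3]"),
                     Change("insert", "/body/p[1]")])
delta = Delta("doc_v1.xml", "doc_v2.xml", [move])
print(len(delta.changes), "top-level change;",
      len(delta.changes[0].parts), "encapsulated changes")
```

Encapsulation is the expressive step: two low-level edits survive inside the delta, yet consumers can reason about the single higher-level "move" they amount to.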

Relevance:

30.00%

Publisher:

Abstract:

Research in fundamental physics with the free neutron is one of the key tools for testing the Standard Model at low energies. The most prominent goals in this field are the search for a neutron electric dipole moment (EDM) and the measurement of the neutron lifetime. Significant improvement of the experimental performance using ultracold neutrons (UCN) requires reduction of both systematic and statistical errors. The development and construction of new UCN sources based on the superthermal concept is therefore an important step for the success of future fundamental physics with ultracold neutrons. Significant enhancement of the UCN densities available today strongly correlates with the efficient use of a UCN converter material. The UCN converter is here to be understood as a medium which reduces the velocity of cold neutrons (CN, velocity of about 600 m/s) to the velocity of UCN (velocity of about 6 m/s). Several big research centers around the world are presently planning or constructing new superthermal UCN sources, mainly based on the use of either solid deuterium or superfluid helium as UCN converter. Thanks to the idea of Yu. Pokotilovsky, there exists the opportunity to build competitive UCN sources also at small research reactors of the TRIGA type. Of course, these smaller facilities do not promise high UCN densities of several 1000 UCN/cm³, but they are able to provide densities around 100 UCN/cm³ for experiments. In the context of this thesis, the feasibility of a superthermal UCN source at the tangential beamport C of the research reactor TRIGA Mainz was successfully demonstrated. Based on a prototype for the future UCN source at the Forschungs-Neutronenquelle Heinz Maier-Leibnitz (FRM II) in Munich, which was planned and built in collaboration with the Technical University of Munich, further investigations and improvements were carried out and are presented in this thesis. In parallel, a second UCN source for the radial beamport D was designed and built. The commissioning of this new source is foreseen for spring 2010. At beamport D, with its higher thermal neutron flux, it should be possible to increase the available UCN density of 4 UCN/cm³ by at least one order of magnitude.
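For scale, the energy gap the converter has to bridge follows directly from the two quoted velocities. A quick back-of-the-envelope calculation with standard constants (illustrative, not taken from the thesis):

```python
# Kinetic energy a cold neutron must shed to become ultracold (E = m v^2 / 2).
M_N = 1.675e-27        # neutron mass [kg]
E_CHARGE = 1.602e-19   # J per eV

for label, v in (("cold neutron", 600.0), ("ultracold neutron", 6.0)):
    e_ev = 0.5 * M_N * v**2 / E_CHARGE
    print(f"{label:>18}: v = {v:5.0f} m/s -> E = {e_ev * 1e9:10.1f} neV")
# ~1.9e6 neV vs ~190 neV: the converter must absorb ~99.99% of the kinetic
# energy in one step, e.g. via phonon emission in superfluid helium or
# solid deuterium -- the essence of the superthermal concept.
```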

Relevance:

30.00%

Publisher:

Abstract:

Dextran-based polymers are versatile hydrophilic materials that can provide functionalized surfaces in various areas, including biological and medical applications. Functional, responsive dextran-based hydrogels are crosslinked dextran polymers whose response to external stimuli can be modulated. The controlled tailoring of hydrogel properties towards specific applications, and the detailed characterization of their optical, mechanical, and chemical properties, are of strong interest in science and for further applications. The structural characteristics of swollen hydrogel matrices, and the characterization of their variation upon environmental changes, are especially challenging. Depending on their properties, hydrogels are applied as actuators, as biosensors, in drug delivery, in tissue engineering, or for medical coatings. However, the field of possible applications still has potential to be expanded. Surface-attached hydrogel films with a thickness of several micrometers can serve as a waveguiding matrix for leaky optical waveguide modes. On the basis of highly swelling, waveguiding, dextran-based hydrogel films, an optical biosensor concept was developed. The synthesis of a dextran-based hydrogel matrix, its functionalization to modulate its response towards external stimuli, and the characterization of the swollen hydrogel films were the main interests within this biosensor project. A second focus was the optimization of the hydrogel characteristics for cell growth, with the aim of creating scaffolds for bone regeneration. Matrix modification enabling successful cell growth experiments with endothelial cells and osteoblasts was achieved. A photo-crosslinkable, carboxymethylated dextran-based hydrogel (PCMD) was synthesized and characterized in terms of swelling behaviour and structural properties. Further functionalization was carried out before and after crosslinking, aiming at external manipulation of the swelling degree and the charge of the hydrogel matrix, which are important for biosensor experiments as well as for cell adhesion. The modulation of functionalized PCMD hydrogel responses to pH, ion concentration, electrochemical switching, or a magnetic force was investigated. The PCMD hydrogel films were optically characterized by combining surface plasmon resonance (SPR) and optical waveguide mode spectroscopy (OWS). This technique allows a detailed analysis of the refractive index profile perpendicular to the substrate surface by applying the Wentzel-Kramers-Brillouin (WKB) approximation. In order to perform biosensor experiments, analyte-capturing units such as proteins or antibodies were covalently coupled to the crosslinked hydrogel backbone by applying active ester chemistry. Consequently, target analytes could be located inside the waveguiding matrix. Using labeled analytes, fluorescence enhancement was achieved by fluorescence excitation with the electromagnetic field in the center of the optical waveguide modes. The fluorescence excited by the evanescent electromagnetic field of the surface plasmon was 2-3 orders of magnitude lower. Furthermore, the signal-to-noise ratio was improved by fluorescence excitation with leaky optical waveguide modes. The applicability of the PCMD hydrogel sensor matrix to clinically relevant samples was proven in a cooperation project on the detection of PSA in serum with long-range surface plasmon spectroscopy (LRSP) and fluorescence excitation by LRSP (LR-SPFS).

Relevance:

30.00%

Publisher:

Abstract:

The recent EU Directive 2010/31 requires member states to reorganize their national legislative frameworks on the energy performance of buildings so that, from 1 January 2021, all new constructions present a nearly zero energy balance; the deadline is brought forward to 1 January 2019 for public buildings. The concept of nearly zero-energy buildings (nZEB) starts from an envelope of passive standard and then offsets the modest annual consumption through the production, preferably on site, of energy from renewable sources. From this perspective, reconsidering the potential of solar architecture provides concrete tools and sound methodologies to support the design of ever more efficient envelopes that fully exploit an inexhaustible, widespread, and universally accessible resource such as solar energy. This is all the more relevant given the no longer deferrable need to reduce the energy load attributable to buildings, which are known to account for over 40% of world energy consumption and 24% of greenhouse gas emissions. On these premises, the research places at its center the integration into the building envelope of solar thermal gain systems (so-called passive) and solar energy production systems (so-called active). The analytical and practical work carried out aims to provide methodological and practical tools for architectural design, which needs a new integrated approach targeted at energy saving objectives. Through a general survey of the concept of solar architecture and of the theoretical and terminological premises underlying it, the research produced three kinds of outcome: a codification of the morphologies recurring in solar buildings, a comparative analysis of solar yield across the main building typologies, and a substantial design verification part in which the assumptions of the previous categories were applied.
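As a taste of the geometry underlying such a comparative analysis of solar yield, the sketch below evaluates the direct irradiance intercepted by envelope surfaces of different tilt and orientation for a single sun position. All numbers are illustrative assumptions, not results from the research.

```python
# Direct irradiance intercepted by building surfaces of different tilt and
# orientation for a given sun position (illustrative values only).
import math

def incidence_factor(sun_alt, sun_az, tilt, surf_az):
    """Cosine of the angle between sun direction and surface normal (clipped at 0)."""
    a, z = math.radians(sun_alt), math.radians(sun_az)
    t, s = math.radians(tilt), math.radians(surf_az)
    cos_i = (math.sin(a) * math.cos(t) +
             math.cos(a) * math.sin(t) * math.cos(z - s))
    return max(0.0, cos_i)

DNI = 800.0                          # direct normal irradiance [W/m2], assumed
sun_alt, sun_az = 35.0, 180.0        # winter midday sun, due south
for name, tilt, az in (("flat roof", 0, 180), ("south facade", 90, 180),
                       ("30-degree south roof", 30, 180), ("north facade", 90, 0)):
    power = DNI * incidence_factor(sun_alt, sun_az, tilt, az)
    print(f"{name:>22}: {power:6.0f} W/m2")
```

Repeating this calculation over the year and over the surfaces of each building typology is, in essence, what a comparative solar-yield analysis of envelope morphologies amounts to.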