864 results for Component-based systems
Abstract:
One of the great analytical challenges is resolving the complexity of analyzing trace quantities of organic compounds, given the low analytical sensitivity of the usual techniques that allow a specific determination, such as IR or NMR. The use of UV-Visible spectrophotometry and spectroluminescence, techniques that offer higher sensitivity, is in many cases hindered by the matrix effect produced in the treatment of real, complex samples, or by a loss of selectivity due to band overlap. The complex-forming interaction between certain substrates and macrocyclic receptors bearing nanometric pores or cavities can affect the spectroscopic properties of the substrates. The response of sensitive techniques can thus translate into a selective analysis, owing to the molecular recognition established between a given receptor and the substrate of interest. Sensitivity may also improve due to micropolarity effects of the medium, to restrictions on degrees of freedom, and to compartmentalization or protection of the excited states of the included substrates. The analytical use of selective receptors is an area currently under development that enables rapid determination of chemical species, reducing the effect of interferents, improving sensitivity, and reducing sample treatment. The mechanisms involved in these interactions, and the factors that modify them, will be studied by spectroscopic techniques such as UV-visible, NMR, and luminescence. The analytical parameters will be determined by luminescence in the media and conditions in which the analytical sensitivity shows the greatest increase. Validation tests will be performed under the best conditions for each analyte, and for mixtures of related analytes in real samples.
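Where such host-guest complexation is followed by UV-visible titration, a standard treatment (assumed here as an illustration; the abstract does not name a specific model) is the 1:1 binding equilibrium and its Benesi-Hildebrand linearization:

```latex
% 1:1 host-guest equilibrium H + G <=> HG with association constant K:
\[ K = \frac{[\mathrm{HG}]}{[\mathrm{H}]\,[\mathrm{G}]} \]
% Benesi-Hildebrand form: with host in excess at concentration [H]_0,
% the absorbance change \Delta A obeys
\[ \frac{1}{\Delta A} = \frac{1}{\Delta A_{\max}} + \frac{1}{K\,\Delta A_{\max}\,[\mathrm{H}]_0} \]
% so K is recovered as intercept/slope from a plot of 1/\Delta A vs. 1/[H]_0.
```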
Abstract:
Magdeburg, Univ., Faculty of Process and Systems Engineering, Dissertation, 2014
Abstract:
Magdeburg, Univ., Faculty of Computer Science, Dissertation, 2015
Abstract:
Magdeburg, Univ., Faculty of Electrical Engineering and Information Technology, Dissertation, 2015
Abstract:
BACKGROUND: Recommendations for statin use for primary prevention of coronary heart disease (CHD) are based on estimation of the 10-year CHD risk. We compared the 10-year CHD risk assessments and eligibility percentages for statin therapy using three scoring algorithms currently used in Europe. METHODS: We studied 5683 women and men, aged 35-75, without overt cardiovascular disease (CVD), in a population-based study in Switzerland. We compared the 10-year CHD risk using three scoring schemes, i.e., the Framingham risk score (FRS) from the U.S. National Cholesterol Education Program's Adult Treatment Panel III (ATP III), the PROCAM scoring scheme from the International Atherosclerosis Society (IAS), and the European risk SCORE for low-risk countries, without and with extrapolation to age 60 as recommended by the European Society of Cardiology (ESC) guidelines. With FRS and PROCAM, high risk was defined as a 10-year risk of fatal or non-fatal CHD > 20%; with SCORE, as a 10-year risk of fatal CVD ≥ 5%. We compared the proportions of high-risk participants and eligibility for statin use according to these three schemes. For each guideline, we estimated the impact of increased statin use from current partial compliance to full compliance on potential CHD deaths averted over 10 years, using a success proportion of 27% for statins. RESULTS: The proportion of participants (both genders) classified as high-risk was 5.8% according to FRS and 3.0% according to PROCAM, whereas the European risk SCORE classified 12.5% as high-risk (15.4% with extrapolation to age 60). For the primary prevention of CHD, 18.5% of participants were eligible for statin therapy using ATP III, 16.6% using IAS, and 10.3% using ESC (13.0% with extrapolation), because the ESC guidelines recommend statin therapy only in high-risk subjects. In comparison with IAS, agreement in identifying adults eligible for statins was good with ATP III but moderate with ESC. From a population perspective, full compliance with ATP III guidelines would avert up to 17.9% of the 24,310 CHD deaths expected over 10 years in Switzerland, versus 17.3% with IAS and 10.8% with ESC (11.5% with extrapolation). CONCLUSIONS: Full compliance with guidelines for statin therapy would result in substantial health benefits, but the proportions of high-risk adults and of adults eligible for statin use varied substantially depending on the scoring systems and corresponding guidelines used for estimating CHD risk in Europe.
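The deaths-averted figures follow directly from the reported reduction proportions and the expected death count; a one-line check of that arithmetic:

```python
# Arithmetic check of the reported impact figures (all numbers from the abstract).
EXPECTED_CHD_DEATHS = 24_310   # expected CHD deaths over 10 years, Switzerland
reduction = {"ATP III": 0.179, "IAS": 0.173, "ESC": 0.108, "ESC (extrapolated)": 0.115}

for guideline, frac in reduction.items():
    print(f"{guideline}: ~{frac * EXPECTED_CHD_DEATHS:,.0f} CHD deaths averted")
# ATP III: ~4,351; IAS: ~4,206; ESC: ~2,625; ESC (extrapolated): ~2,796
```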
Abstract:
Electromagnetic radiation at terahertz frequencies (from 0.1 THz to 10 THz) lies in the frequency band between the optical band and the radio band. The interest of the scientific community in this band has grown owing to its large potential for developing innovative imaging systems. Terahertz waves can be generated as extremely short pulses that achieve good spatial resolution, offer good penetration, and allow microscopic structures to be identified through spectral analysis. The work carried out during the grant period was based on the development of systems working in this frequency band. The main system is a total power radiometer working at 0.1 THz for security imaging. Its development was also useful for gaining knowledge of the behavior of the component systems in this band. In addition, a vector network analyzer was used to characterize materials and perform active raster imaging. A materials measurement system was designed and used to measure material properties such as permittivity, losses, and water concentration. Finally, the design of a terahertz time-domain spectrometer (THz-TDS) was started. This system will allow tomographic measurements with very high penetration resolution while also providing spectral characterization of the sample material. The application range of this kind of system is very wide: from identifying cancerous skin tissue to characterizing the thickness of a car's painted surface.
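For a total power radiometer like the 0.1 THz imager described above, the achievable temperature sensitivity follows the standard radiometer equation; the numeric values below are illustrative assumptions, not the actual system's specifications:

```python
import math

# Standard total-power radiometer sensitivity: dT = T_sys / sqrt(B * tau).
# The example values are assumptions for illustration, not the system's specs.
T_SYS = 2000.0      # system noise temperature [K]
BANDWIDTH = 10e9    # predetection bandwidth [Hz]
TAU = 1e-3          # integration time per pixel [s]

delta_T = T_SYS / math.sqrt(BANDWIDTH * TAU)
print(f"Radiometric sensitivity: {delta_T:.2f} K")   # ~0.63 K with these values
```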
Abstract:
Process supervision is the activity of monitoring process operation in order to deduce the conditions needed to maintain normality, including when faults are present. Depending on the number, distribution, and heterogeneity of a process's variables, behaviour situations, sub-processes, and so on, human operators and engineers cannot easily handle the information. This makes the automation of supervision activities necessary. Nevertheless, the difficulty of dealing with the information complicates the design and development of software applications. We present an approach called "integrated supervision systems". It proposes the coordination of multiple supervisors to supervise multiple sub-processes, whose interactions permit one to supervise the global process.
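A minimal object-oriented sketch of this coordination idea follows; the class names and the simple aggregation rule are assumptions for illustration, since the abstract does not specify the architecture:

```python
# Illustrative sketch of coordinated supervisors; names and the aggregation
# rule are assumptions, not the architecture proposed in the paper.
from dataclasses import dataclass

@dataclass
class SubProcessSupervisor:
    name: str
    fault_threshold: float

    def assess(self, measurement: float) -> str:
        """Local diagnosis of one sub-process from one monitored variable."""
        return "fault" if measurement > self.fault_threshold else "normal"

class GlobalSupervisor:
    """Coordinates local supervisors to derive a global process status."""
    def __init__(self, supervisors):
        self.supervisors = supervisors

    def supervise(self, measurements: dict) -> str:
        states = [s.assess(measurements[s.name]) for s in self.supervisors]
        return "degraded" if "fault" in states else "normal"

plant = GlobalSupervisor([SubProcessSupervisor("reactor", 350.0),
                          SubProcessSupervisor("cooler", 90.0)])
print(plant.supervise({"reactor": 340.0, "cooler": 95.0}))  # -> "degraded"
```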
Abstract:
BACKGROUND Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in the diagnosis of Alzheimer's Disease (AD). However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to lie within a predefined brain activation mask. In order to address the small-sample-size problem, the dimension of the feature space was further reduced by Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA), or Partial Least Squares (PLS) (the latter two also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis, and energy-based metrics were compared. RESULTS Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: i) a linear transformation of the PLS- or PCA-reduced data, ii) a feature reduction technique, and iii) a classifier (with Euclidean, Mahalanobis, or energy-based methodology). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity, and specificity values of 92.78%, 91.07%, and 95.12% (for SPECT) and 90.67%, 88%, and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS All the proposed methods turned out to be valid solutions for the presented problem. One advantage is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also (in combination with NMSE and PLS) makes the variation of this rate more stable. Their generalization ability is another advantage, since several experiments were performed on two image modalities (SPECT and PET).
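A simplified version of this pipeline (t-test feature selection, PLS reduction, SVM classification) can be sketched with scikit-learn; the LMNN metric-learning stage is omitted here, and the feature count, component count, and data shapes are illustrative assumptions:

```python
import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.cross_decomposition import PLSRegression
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

class PLSTransform(BaseEstimator, TransformerMixin):
    """Supervised PLS dimensionality reduction usable inside a Pipeline."""
    def __init__(self, n_components=10):
        self.n_components = n_components
    def fit(self, X, y):
        self.pls_ = PLSRegression(n_components=self.n_components).fit(X, y)
        return self
    def transform(self, X):
        return self.pls_.transform(X)

# Simplified t-test -> PLS -> SVM pipeline; the paper's LMNN stage is omitted
# and k=500 / 10 components are illustrative assumptions.
pipe = Pipeline([
    ("ttest", SelectKBest(f_classif, k=500)),   # voxel-wise t-test selection
    ("pls", PLSTransform(n_components=10)),     # PLS feature reduction
    ("svm", SVC(kernel="linear")),              # SVM classifier
])

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5000))                 # e.g. 80 scans x 5000 voxels
y = rng.integers(0, 2, size=80)                 # AD vs. control labels
print(cross_val_score(pipe, X, y, cv=5).mean()) # k-fold cross-validation
```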
Abstract:
Diagnosis of several neurological disorders is based on the detection of typical pathological patterns in the electroencephalogram (EEG). This is a time-consuming task requiring significant training and experience. Automatic detection of these EEG patterns would greatly assist in quantitative analysis and interpretation. We present a method that allows automatic detection of epileptiform events and their discrimination from eye blinks, based on features derived using a novel application of independent component analysis. The algorithm was trained and cross-validated using seven EEGs with epileptiform activity. For epileptiform events with compensation for eye blinks, the sensitivity was 65 +/- 22% at a specificity of 86 +/- 7% (mean +/- SD). With feature extraction by PCA or classification of raw data, specificity fell to 76% and 74%, respectively, at the same sensitivity. On exactly the same data, the commercially available software Reveal had a maximum sensitivity of 30% with a concurrent specificity of 77%. Our algorithm performed well at detecting epileptiform events in this preliminary test and offers a flexible tool that is intended to be generalized to the simultaneous classification of many waveforms in the EEG.
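A minimal illustration of applying independent component analysis to multichannel EEG with scikit-learn follows; the channel count, window, and energy feature are assumptions for illustration and do not reproduce the paper's specific feature derivation or classifier:

```python
import numpy as np
from sklearn.decomposition import FastICA

# Minimal ICA-on-EEG illustration; shapes and the energy feature are
# assumptions, not the paper's exact feature derivation.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((10_000, 19))   # samples x channels (e.g. 10-20 montage)

ica = FastICA(n_components=19, random_state=0)
sources = ica.fit_transform(eeg)          # estimated independent components

# Per-component energy over a candidate event window: one simple feature a
# classifier could use to separate epileptiform events from eye blinks.
window = sources[4_000:4_200]
energy = (window ** 2).sum(axis=0)
print(energy.argmax(), energy.max())
```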
Abstract:
The cDNA encoding the NH2-terminal 589 amino acids of the extracellular domain of the human polymeric immunoglobulin receptor was inserted into transfer vectors to generate recombinant baculo- and vaccinia viruses. Following infection of insect and mammalian cells, respectively, the resulting truncated protein, corresponding to human secretory component (hSC), was secreted with high efficiency into serum-free culture medium. The Sf9 insect cell/baculovirus system yielded as much as 50 mg of hSC per liter of culture, while the mammalian cell/vaccinia virus system produced up to 10 mg of protein per liter. The M(r) of recombinant hSC varied depending on the cell line in which it was expressed (70,000 in Sf9 cells and 85,000-95,000 in CV-1, TK-143B, and HeLa). These variations in M(r) resulted from different glycosylation patterns, as evidenced by endoglycosidase digestion. Efficient single-step purification of the recombinant protein was achieved either by concanavalin A affinity chromatography or by Ni(2+)-chelate affinity chromatography when a 6xHis tag was engineered onto the carboxyl terminus of hSC. Recombinant hSC retained the capacity to specifically reassociate with dimeric IgA purified from hybridoma cells.
Abstract:
It is well known that multiple-input multiple-output (MIMO) techniques can bring numerous benefits, such as higher spectral efficiency, to point-to-point wireless links. More recently, there has been interest in extending MIMO concepts to multiuser wireless systems. Our focus in this paper is on network MIMO, a family of techniques whereby each end user in a wireless access network is served through several access points within its range of influence. By tightly coordinating the transmission and reception of signals at multiple access points, network MIMO can transcend the limits on spectral efficiency imposed by cochannel interference. Taking prior information-theoretic analyses of network MIMO to the next level, we quantify the spectral efficiency gains obtainable under realistic propagation and operational conditions in a typical indoor deployment. Our study relies on detailed simulations and, for specificity, is conducted largely within the physical-layer framework of the IEEE 802.16e Mobile WiMAX system. Furthermore, to facilitate the coordination between access points, we assume that a high-capacity local area network, such as Gigabit Ethernet, connects all the access points. Our results confirm that network MIMO stands to provide a multiple-fold increase in spectral efficiency under these conditions.
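The interference-limitation argument can be illustrated with Shannon's formula; the SNR and interference levels below are arbitrary assumptions chosen only to show the effect of removing cochannel interference through coordination:

```python
import math

# Per-link spectral efficiency SE = log2(1 + SINR). The SNR and interference
# levels are arbitrary illustrative assumptions, not values from the paper.
snr = 10 ** (20 / 10)            # 20 dB desired-signal SNR (linear)
interference = 10 ** (15 / 10)   # 15 dB cochannel interference (linear)

se_uncoordinated = math.log2(1 + snr / (1 + interference))

# With ideal network MIMO coordination, cochannel interference is removed.
se_coordinated = math.log2(1 + snr)

print(f"Uncoordinated: {se_uncoordinated:.2f} bit/s/Hz")  # ~2.0 bit/s/Hz
print(f"Coordinated:   {se_coordinated:.2f} bit/s/Hz")    # ~6.7 bit/s/Hz
```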
Abstract:
BACKGROUND: Allostatic load reflects cumulative exposure to stressors throughout life and has been associated with several adverse health outcomes. It is hypothesized that people with low socioeconomic status (SES) are exposed to higher chronic stress and therefore have greater levels of allostatic load. OBJECTIVE: To assess the association of receiving social transfers and of low education with allostatic load. METHODS: We included 3589 participants (1812 women) aged over 35 years and under retirement age from the population-based CoLaus study (Lausanne, Switzerland, 2003-2006). We computed an allostatic load index aggregating cardiovascular, metabolic, dyslipidemic, and inflammatory markers. A novel index additionally including markers of oxidative stress was also examined. RESULTS: Men with low vs. high SES were more likely to have higher levels of allostatic load (odds ratio (OR) = 1.93/2.34 for social transfers/education, 95% CI from 1.45 to 4.17). The same patterns were observed among women. Associations persisted after controlling for health behaviors and marital status. CONCLUSIONS: Low education and receiving social transfers independently and cumulatively predict high allostatic load and dysregulation of several homeostatic systems in a Swiss population-based study. Participants with low SES are at higher risk of oxidative stress, which may justify its inclusion as a separate component of allostatic load.
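Allostatic load indices are commonly computed by counting how many biomarkers fall in a high-risk quartile; the sketch below uses that common convention as an assumption, since the abstract does not spell out the study's aggregation rule:

```python
import numpy as np

# Common quartile-count construction of an allostatic load index; this rule
# and the marker count are assumptions, not the study's documented method.
def allostatic_load_index(markers: np.ndarray) -> np.ndarray:
    """markers: (n_participants, n_biomarkers), oriented so higher = worse.
    Returns, per participant, the count of markers in the top-risk quartile."""
    q75 = np.percentile(markers, 75, axis=0)   # per-marker risk threshold
    return (markers >= q75).sum(axis=1)

rng = np.random.default_rng(0)
X = rng.normal(size=(3589, 8))                 # e.g. 8 biomarkers
print(allostatic_load_index(X)[:5])            # index for first 5 participants
```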
Abstract:
A systolic array to implement lattice-reduction-aided linear detection is proposed for a MIMO receiver. The lattice reduction algorithm and the ensuing linear detections are operated in the same array, which can be hardware-efficient. The all-swap lattice reduction algorithm (ASLR) is considered for the systolic design. ASLR is a variant of the LLL algorithm, which processes all lattice basis vectors within one iteration. Lattice-reduction-aided linear detection based on the ASLR and LLL algorithms has very similar bit-error-rate performance, while ASLR is more time-efficient in the systolic array, especially for systems with a large number of antennas.
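For reference, a compact (and deliberately unoptimized) sketch of the textbook LLL algorithm that ASLR parallelizes is given below; it is not the all-swap variant or a systolic implementation:

```python
import numpy as np

def lll_reduce(B, delta=0.75):
    """Textbook LLL reduction of the rows of B. Reference sketch only: it
    recomputes Gram-Schmidt each pass, unlike the paper's all-swap (ASLR)
    variant, which swaps all eligible basis pairs within one iteration."""
    B = np.array(B, dtype=float)
    n = B.shape[0]

    def gram_schmidt(B):
        Q = np.zeros_like(B)
        mu = np.zeros((n, n))
        for i in range(n):
            Q[i] = B[i]
            for j in range(i):
                mu[i, j] = B[i] @ Q[j] / (Q[j] @ Q[j])
                Q[i] -= mu[i, j] * Q[j]
        return Q, mu

    k = 1
    while k < n:
        Q, _ = gram_schmidt(B)
        for j in range(k - 1, -1, -1):            # size-reduce b_k against b_j
            mu_kj = B[k] @ Q[j] / (Q[j] @ Q[j])
            B[k] -= round(mu_kj) * B[j]
        Q, mu = gram_schmidt(B)
        if Q[k] @ Q[k] >= (delta - mu[k, k - 1] ** 2) * (Q[k - 1] @ Q[k - 1]):
            k += 1                                 # Lovász condition satisfied
        else:
            B[[k, k - 1]] = B[[k - 1, k]]          # swap and backtrack
            k = max(k - 1, 1)
    return B

print(lll_reduce([[201, 37], [1648, 297]]))        # small 2-D example basis
```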
Abstract:
The paper presents a competence-based instructional design system and a way to personalize navigation through the course content. The navigation aid tool builds on the competence graph and the student model, which includes elements of uncertainty in the assessment of students. An individualized navigation graph is constructed for each student, suggesting the competences the student is best prepared to study. We use fuzzy set theory for dealing with uncertainty. The marks of the assessment tests are transformed into linguistic terms and used for assigning values to linguistic variables. For each competence, the level of difficulty and the level of knowledge of its prerequisites are calculated based on the assessment marks. Using these linguistic variables and approximate reasoning (fuzzy IF-THEN rules), a crisp category is assigned to each competence regarding its level of recommendation.
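A minimal sketch of this fuzzification-and-rules step follows; the 0-10 mark scale, the triangular cut-points, and the single rule are assumptions for illustration, not the paper's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Linguistic terms for an assessment mark on an assumed 0-10 scale.
def fuzzify(mark):
    return {"low":    tri(mark, -1, 0, 5),
            "medium": tri(mark,  2, 5, 8),
            "high":   tri(mark,  5, 10, 11)}

# One illustrative fuzzy IF-THEN rule: IF prerequisite knowledge is high AND
# difficulty is low THEN recommend the competence (min used as the AND operator).
def recommendation(prereq_mark, difficulty_mark):
    return min(fuzzify(prereq_mark)["high"], fuzzify(difficulty_mark)["low"])

print(recommendation(8.0, 3.0))   # degree to which the competence is recommended
```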
Abstract:
"Sitting between your past and your future doesn't mean you are in the present." (Dakota Skye)

Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from the social, natural, and mathematical sciences. The emergence of a higher-order organization or behavior, transcending that expected of the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions amongst the system's components. In addition to the actual nature of the parts' interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by these systems. Moreover, the topology is also a key factor in explaining the extraordinary flexibility and resilience to perturbations observed in transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and on the fault tolerance of systems in two different contexts.

In the first part, we study cellular automata, which are a simple paradigm for distributed computation. Cellular automata are made of basic Boolean computational units, the cells, which rely on simple rules and information from the surrounding cells to perform a global task. The limited visibility of the cells can be modeled as a network, where interactions amongst cells are governed by an underlying structure, usually a regular one. In order to increase the performance of cellular automata, we chose to change their topology. We applied computational principles inspired by Darwinian evolution, called evolutionary algorithms, to alter the system's topological structure starting from either a regular or a random one. The outcome is remarkable: the resulting topologies share properties of both regular and random networks, and display similarities to the Watts-Strogatz small-world networks found in social systems. Moreover, the performance and tolerance to probabilistic faults of our small-world-like cellular automata surpass those of regular ones.

In the second part, we use the context of biological genetic regulatory networks and, in particular, Kauffman's random Boolean networks model. In some ways this model is close to cellular automata, although it is not expected to perform any task. Instead, it simulates the time evolution of genetic regulation within living organisms under strict conditions. The original model, though very attractive in its simplicity, suffered from important shortcomings unveiled by recent advances in genetics and biology. We propose to use these new discoveries to improve the original model. Firstly, we have used artificial topologies believed to be closer to those of gene regulatory networks. We have also studied actual biological organisms and used parts of their genetic regulatory networks in our models. Secondly, we have addressed the improbable full synchronicity of the events taking place in Boolean networks and proposed a more biologically plausible cascading update scheme. Finally, we tackled the actual Boolean functions of the model, i.e., the specifics of how genes activate according to the activity of upstream genes, and presented a new update function that takes into account the actual promoting and repressing effects of one gene on another. Our improved models demonstrate the expected, biologically sound behavior of the previous GRN model, yet with superior resistance to perturbations. We believe they are one step closer to the biological reality.
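A minimal sketch of a classical Kauffman random Boolean network with synchronous updates is given below for orientation; it implements only the original model, not the asynchronous cascading scheme or the biologically informed update functions proposed in the thesis:

```python
import numpy as np

# Classical Kauffman NK random Boolean network with synchronous updates.
# Original model only; the thesis's cascading scheme and promote/repress
# update functions are not reproduced here.
rng = np.random.default_rng(0)
N, K = 12, 2                                   # N genes, K regulators per gene

# Random wiring and one random Boolean function (truth table) per gene.
inputs = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
tables = rng.integers(0, 2, size=(N, 2 ** K))
state = rng.integers(0, 2, size=N)

def step(state):
    """Synchronously update every gene from its K regulators' current values."""
    idx = (state[inputs] * (2 ** np.arange(K))).sum(axis=1)  # truth-table row
    return tables[np.arange(N), idx]

for t in range(8):                             # short trajectory of the network
    print("".join(map(str, state)))
    state = step(state)
```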