922 results for Assignments for benefit of creditors


Relevance:

100.00%

Publisher:

Abstract:

Background: Cardiovascular diseases (CVD) are the leading cause of morbidity and mortality worldwide. CVD mainly comprises coronary heart disease and stroke, which were ranked first and fourth, respectively, among the leading causes of death in the United States. Influenza (flu) causes annual outbreaks and pandemics and is increasingly recognized as an important trigger for acute coronary syndromes and stroke. Influenza vaccination is an inexpensive and effective strategy for preventing influenza-related complications in high-risk individuals. Though it is recommended for all CVD patients, influenza vaccine is still used at suboptimal levels in these patients owing to prevailing controversy about its effectiveness in preventing CVD. This review was undertaken to critically assess the effectiveness of influenza vaccination as a primary or secondary prevention method for CVD. Methods: A systematic review was conducted using the electronic databases OVID MEDLINE, PubMed (National Library of Medicine), EMBASE, Google Scholar and TRIP (Turning Research into Practice). The search was limited to peer-reviewed articles published in English from January 1970 through May 2012. Case-control studies, cohort studies and randomized controlled trials related to influenza vaccination and CVD, with data on at least one of the outcomes, were identified. Only population-based epidemiologic studies of all ethnic groups and either sex, restricted to ages 30 years or above, with clinical CVD outcomes of interest were included. Results: Of the 16 studies (8 case-control studies, 6 cohort studies and 2 randomized controlled trials) that met the inclusion criteria, 14 reported a significant benefit of influenza vaccination as a primary or secondary prevention method for preventing new cardiovascular events. Contrary to these findings, two studies reported no significant benefit of vaccination in CVD prevention. Conclusion: The available body of evidence indicates that vaccination against influenza is associated with a reduction in the risk of new CVD events, hospitalization for coronary heart disease and stroke, as well as the risk of death. The findings suggest that influenza vaccination is effective in CVD prevention and should be encouraged for the high-risk population. However, larger future studies such as randomized controlled trials are needed to further evaluate and confirm these findings.


Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are thus formed to enable a biological function and are disassembled once the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the better design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of reduced flexibility and dimensions. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structure of the different constituent components within the assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques, such as cryo-electron microscopy (cryo-EM), are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic level of detail.
In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography, or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail to permit a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example structural data from other sources such as X-ray crystallography, in order to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features that in general correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system, as in manuscript III. Three manuscripts are presented as part of this PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate the atomic model for the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly. Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions.
The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus, nor for the two different component glycoproteins that form its envelope. However, homology models may be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting from the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The generated atomic model is supported by prior knowledge of virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces evolutionary tabu search strategies applied to enable a multi-body registration. The technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote the proper exploration of the high-dimensional search space.
As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the different components but are lacking for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such a registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling proper docking into the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation, presenting the benefit of the technique in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions up to 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions. In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data, and are visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail where alpha helices are visible. Up to a resolution of 12 Å, the method measures sensitivities between 70 and 100% as estimated in experimental test cases, i.e. 70-100% of the alpha-helices were correctly predicted in an automatic manner in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with the annotation of consistent patterns at high resolution. Such methods are essential for the modeling of cryo-EM data, and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
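The hybrid of a genetic algorithm with a tabu list can be illustrated on a toy placement problem (a minimal sketch with a made-up scoring function and grid positions, not Sculptor's actual multi-body registration): candidate placements of two components evolve by mutation and selection, while a tabu set prevents revisiting configurations already explored.

```python
import random

random.seed(0)

# Toy stand-in for the map fit: in the real method the score would be the
# correlation between the cryo-EM map and the placed component structures.
# TARGET is a hypothetical optimal pair of 2D placements.
TARGET = ((3, 7), (9, 2))

def fitness(genome):
    # Higher is better; 0 only at the optimal simultaneous placement.
    return -sum(abs(a - b) for comp, ref in zip(genome, TARGET)
                for a, b in zip(comp, ref))

def mutate(genome):
    # Shift one coordinate of one component by +/-1 on an 11x11 grid.
    comp = random.randrange(len(genome))
    axis = random.randrange(2)
    g = [list(c) for c in genome]
    g[comp][axis] = max(0, min(10, g[comp][axis] + random.choice((-1, 1))))
    return tuple(tuple(c) for c in g)

def evolutionary_tabu_search(pop_size=20, generations=200):
    pop = [tuple((random.randrange(11), random.randrange(11)) for _ in range(2))
           for _ in range(pop_size)]
    tabu = set(pop)  # visited configurations are never re-explored
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        children = []
        for p in parents:
            child = mutate(p)
            if child not in tabu:   # tabu rule: reject revisits
                tabu.add(child)
                children.append(child)
        pop = (parents + children)[:pop_size]
    return max(pop, key=fitness)

best = evolutionary_tabu_search()
```

The tabu set is what distinguishes this from a plain genetic algorithm: by forbidding already-visited placements it pushes the population toward unexplored regions of the search space.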


Lung cancer is the leading cause of cancer death in both men and women in the United States and worldwide. Despite improvements in treatment strategies, the 5-year survival rate of lung cancer patients remains low. Thus, effective chemoprevention and treatment approaches are sorely needed. Mutation and activation of KRAS occur frequently in tobacco users and in the early stages of development of non-small cell lung cancer (NSCLC), so they are thought to be the primary driver of lung carcinogenesis. My work showed that KRAS mutation and activation modulate the expression of TNF-related apoptosis-inducing ligand (TRAIL) receptors by up-regulating death receptors and down-regulating decoy receptors. In addition, we showed that KRAS suppresses cellular FADD-like IL-1β-converting enzyme (FLICE)-like inhibitory protein (c-FLIP) expression through ERK/MAPK-mediated activation of c-MYC, which means that mutant KRAS cells can be specifically targeted via TRAIL-induced apoptosis. The expression level of Inhibitors of Apoptosis Proteins (IAPs) in mutant KRAS cells is usually high, which can be overcome by a second mitochondria-derived activator of caspases (Smac) mimetic. The combination of TRAIL and Smac mimetic therefore induced a synthetic lethal reaction specifically in mutant-KRAS cells, but not in normal lung cells or wild-type KRAS lung cancer cells. A synthetic lethal interaction among TRAIL, Smac mimetic and KRAS mutations could thus be used as an approach for the chemoprevention and treatment of NSCLC with KRAS mutations. Further animal experiments showed that short-term, intermittent treatment with TRAIL and Smac mimetic induced apoptosis in mutant KRAS cells and reduced tumor burden in a KRAS-induced pre-malignancy model and in mutant KRAS NSCLC xenograft models. These results show the great potential benefit of a selective therapeutic approach for the chemoprevention and treatment of NSCLC with KRAS mutations.


Of the large clinical trials evaluating the efficacy of screening mammography, none included women ages 75 and older. Recommendations on an upper age limit at which to discontinue screening are based on indirect evidence and are not consistent. Here, screening mammography is evaluated using observational data from the SEER-Medicare linked database. Measuring the benefit of screening mammography is difficult due to the impact of lead-time bias, length bias and over-detection. The underlying conceptual model divides the disease into two stages: pre-clinical (T0) and symptomatic (T1) breast cancer. Treating the times in these phases as a pair of dependent bivariate observations, (t0, t1), estimates are derived to describe the distribution of this random vector. To quantify the effect of screening mammography, statistical inference is made about the mammography parameters that correspond to the marginal distribution of the symptomatic-phase duration (T1). This shows that the hazard ratio of death from breast cancer comparing women with screen-detected tumors to those whose tumors were detected at symptom onset is 0.36 (0.30, 0.42), indicating a benefit among the screen-detected cases.
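The reported hazard ratio can be illustrated with a toy simulation (a minimal sketch assuming independent exponential survival times and no censoring, far simpler than the dependent bivariate model used in the study; the hazard values are made up, only their ratio of 0.36 matches the abstract): with exponential data, the maximum-likelihood hazard estimate is the number of events divided by total follow-up time, so the ratio of the two group estimates recovers the true hazard ratio.

```python
import random

random.seed(42)

# Hypothetical hazards: symptomatic detection is the reference group;
# the screen-detected group has 0.36 times that hazard.
H_SYMPTOM, H_SCREEN = 0.10, 0.036
N = 20000

def simulate(hazard, n):
    # Exponential survival times with the given hazard rate.
    return [random.expovariate(hazard) for _ in range(n)]

symptom_times = simulate(H_SYMPTOM, N)
screen_times = simulate(H_SCREEN, N)

# MLE of an exponential hazard = events / total follow-up time,
# so the hazard ratio is the ratio of the two estimates.
hr = (N / sum(screen_times)) / (N / sum(symptom_times))
print(round(hr, 2))  # close to the true ratio 0.36
```

The actual analysis is more delicate precisely because screen detection shifts the observation into the pre-clinical phase, which is why the study models (t0, t1) jointly rather than assuming independence as this sketch does.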


Background: Postpartum hemorrhage (PPH) remains a major killer of women worldwide. Standard uterotonic treatments used to control postpartum bleeding do not always work and are not always available. Misoprostol's potential as a treatment option for PPH is increasingly recognized, but its use remains ad hoc and the available evidence does not support the safety or efficacy of one particular regimen. This study aimed to determine the adjunct benefit of misoprostol when combined with standard oxytocics for PPH treatment. Methods: A randomized controlled trial was conducted in four Karachi hospitals from December 2005 to April 2007 to assess the benefit of a 600 mcg dose of misoprostol given sublingually in addition to standard oxytocics for postpartum hemorrhage treatment. Consenting women had their blood loss measured after normal vaginal delivery and were enrolled in the study after losing more than 500 ml of blood. Women were randomly assigned to receive either 600 mcg sublingual misoprostol or matching placebo in addition to standard PPH treatment with injectable oxytocics. Both women and providers were blinded to the treatment assignment. Blood loss was collected until active bleeding stopped and for a minimum of one hour after PPH diagnosis. Total blood loss, hemoglobin measures, and treatment outcomes were recorded for all participants. Results: Due to a much lower rate of PPH than expected (1.2%), only sixty-one patients were diagnosed and treated for PPH in this study, and we were therefore unable to measure statistical significance in any of the primary endpoints. The addition of 600 mcg sublingual misoprostol to standard PPH treatments does, however, suggest a trend toward reduced postpartum blood loss, a smaller drop in postpartum hemoglobin, and a need for fewer additional interventions. Women who bled less overall had a significantly smaller drop in hemoglobin and received fewer additional interventions. There were no hysterectomies or maternal deaths among study participants. The rate of transient shivering and fever was significantly higher among women receiving misoprostol. Conclusion: A 600 mcg dose of misoprostol given sublingually shows promise as an adjunct treatment for PPH, and its use should continue to be explored for its life-saving potential in the care of women experiencing PPH. Trial Registration: ClinicalTrials.gov, Registry No. NCT00116480


The filling-up of the lake which existed in the basin of the Trentelmoor (40 km east of Hannover, Germany) in Preboreal times was completed 2000 years ago. Since then, fen vegetation has covered the former lake's surface. The postglacial development of the vegetation follows the pattern typical of Central Europe. However, due to the poorness of the soils around the Trentelmoor, the frequencies of some tree species differ. Beech, for example, never reached, to the benefit of oak, the importance that this tree species usually gains on better soils. Human impact becomes recognizable for the first time in the upper Neolithic. The area has been settled continuously, but with changing intensity, throughout the last 3000 years. When the manuscript of this paper went to press, the results of only two radiocarbon age determinations were available; an additional three determinations were completed somewhat later. See the accompanying table for results.


The Rieseberger Moor is a fen, 145 hectares in size, situated about 20 km east of Brunswick (Braunschweig), Lower Saxony, Germany. Peat was dug in the fen, with changing intensity, from the mid-18th century until around AD 1955. According to Schneekloth & Schneider (1971), the remaining peat (fen and wood peat) is predominantly 1.5 to 2 m thick (maximum 2.7 m). Part of the fen, now a nature reserve (NSG BR 005), is wooded (Betula, Salix, Alnus). For more information on the Rieseberger Moor see http://de.wikipedia.org/wiki/Rieseberger_Moor. Willi Selle was the first to publish pollen diagrams from this site (Selle 1935, profiles Rieseberger Torfmoor I and II). This report deals with a 2.2 m long profile from the wooded south-eastern part of the fen, consisting of strongly decomposed fen peat, taken in AD 1965 and studied by pollen analysis in the same year. The peat below 1.45 m contained silt and clay; samples 1.48 and 1.58 m even contained fine sand. These samples had to be treated with HF (hydrofluoric acid) in addition to the treatment with hot caustic potash solution. The coring ended in sandy material. The new pollen data reflect the early part of the known postglacial development of the vegetation of this area: the change from a birch-dominated forest to a pine forest, and the later spreading of Corylus and of the thermophilous deciduous tree genera Quercus, Ulmus, Tilia and Fraxinus, followed by the expansion of Alnus. The new data are in agreement with Selle's results, except for Alnus, which in Selle's pollen diagram II shows high values (up to 42% of the arboreal pollen sum) even in samples deposited before Corylus and Quercus started to spread. By contrast, the new pollen diagram shows that alder pollen, although present in all samples, is frequent in the three youngest pollen spectra only. A period with dominating Alnus, as seen in the uppermost part of Selle's pollen diagrams, is missing.
The latter is most likely the result of peat cutting at the later coring site, whereas the early, unusually high alder values of Selle's pollen study are probably caused by contamination of the pollen samples with younger peat. Selle usually took peat samples with a "Torfbohrer" (= Hiller sampler). This side-filling type of sampler, with an inner chamber and an outer loose jacket, offers, if not handled with appropriate care, ample opportunity to contaminate older peat with carried-off younger material. Pollen grains of Fagus (2% of the arboreal pollen sum) were found in two samples only, namely in the uppermost samples of the new profile (0.18 m) and of Selle's profile I (0.25 m). If this pollen is autochthonous, in other words, if this near-surface peat was not disturbed by human activities, the Fagus pollen indicates an Early Subboreal age for this part of the profile. The accumulation of the Rieseberg peat started during the Preboreal. Increased values of Corylus, Quercus and Ulmus indicate that sample 0.78 m of the new profile is the oldest Boreal sample. The high Alnus values prove the Atlantic age of the younger peat. Whether Early Subboreal peat exists at the site is questionable, but evidently none of the three profiles reaches Late Subboreal time, when Fagus spread in the region. Did peat growth end during the Subboreal? Did younger peat exist but get lost to peat cutting, or has younger peat simply not yet been found in the Rieseberg fen? These questions cannot be answered by this study. The temporary decline of the Pinus curve in favour of Betula during the Preboreal, unusual for this period, is contemporaneous with the deposition of sand (Rieseberger Moor II, 1.33-1.41 m; samples 1.48 and 1.58 m of the new profile) and must be considered a local phenomenon. Literature: Schneekloth, Heinrich & Schneider, Siegfried (1971). Die Moore in Niedersachsen. 2. Teil. Bereich des Blattes Braunschweig der Geologischen Karte der Bundesrepublik Deutschland (1:200000). Schriften der wirtschaftswissenschaftlichen Gesellschaft zum Studium Niedersachsens e.V., Reihe A I, Band 96, Heft 2, 83 Seiten, Göttingen. Selle, Willi (1935). Das Torfmoor bei Rieseberg. Jahresbericht des Vereins für Naturwissenschaft zu Braunschweig, 23, 46-58, Braunschweig.


During China's transition from a planned economy to a market economy in the 1990s, there was a considerable accrual of deferred payment and default due to weak enforcement institutions. This was a very common phenomenon in the transition economies at that time. Interviews with home electronics appliance firms revealed that firms coped with this problem by adjusting their sales mechanisms (four types were found), and that the benefit of formal institutions was limited. A theoretical analysis claims that spot transactions and integration are inferior to contracts, and that a contract combining a volume rebate and prepayment with an exclusive agent can realize the lowest cost and price. The empirical part shows that the mechanisms converged on a mechanism with a volume rebate and an exclusive agent, and that its price level is the lowest. Competition is the driving force behind the convergence of mechanisms and the improvement of risk-management capacity.


There are large numbers of business communities in India which neither had any formal education nor took any professional training, but which still contribute to successful business formation. Their presence can be felt in all areas of business. Yet there is a big professional gap between educational institutes, especially the B-Schools, and this independent business community. This paper makes an effort to develop a two-way learning relationship for the mutual benefit of both entities. It also highlights the role of an educational institute beyond academics in the well-being of society. This may lead to the exchange of innovative business ideas and the framing of suitable policies for long-term sustainability in today's competitive arena. A study conducted by the researcher with a sample size of 100, comprising a mix of well-known academic professionals, MBA students and non-academic business professionals, revealed that there is a need for an exchange program for mutual benefit. A big professional gap exists in this area, which can be filled by active and effective initiatives from management institutes. This paper highlights the gap and suggests a framework to bridge it.


Workflow reuse is a major benefit of workflow systems and shared workflow repositories, but there are barely any studies that quantify the degree of reuse of workflows or the practical barriers that may stand in the way of successful reuse. In our own work, we hypothesize that defining workflow fragments improves reuse, since end-to-end workflows may be very specific and only partially reusable by others. This paper reports on a study of the current use of workflows and workflow fragments in labs that use the LONI Pipeline, a popular workflow system used mainly for neuroimaging research that enables users to define and reuse workflow fragments. We present an overview of the benefits of workflows and workflow fragments reported by users in informal discussions. We also report on a survey of researchers in a lab that has the LONI Pipeline installed, asking them about their experiences with reuse of workflow fragments and the actual benefits they perceive. This leads to quantifiable indicators of the reuse of workflows and workflow fragments in practice. Finally, we discuss barriers to further adoption of workflow fragments and workflow reuse that motivate further work.
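As a toy illustration of the kind of quantifiable reuse indicators such a study can report (a minimal sketch over made-up workflow data, not the actual LONI Pipeline survey metrics), the share of fragments that are reused and the share of workflows containing a reused fragment can be counted directly:

```python
from collections import Counter

# Hypothetical workflows, each given as the set of fragment names it uses.
workflows = {
    "smoothing_study":   {"align", "skull_strip", "smooth"},
    "thickness_study":   {"align", "skull_strip", "surface"},
    "qc_pipeline":       {"qc_report"},
    "registration_only": {"align"},
}

# A fragment counts as "reused" when it appears in more than one workflow.
usage = Counter(f for frags in workflows.values() for f in frags)
reused = {f for f, n in usage.items() if n > 1}

# Indicator 1: share of distinct fragments that are reused at all.
fragment_reuse_rate = len(reused) / len(usage)

# Indicator 2: share of workflows containing at least one reused fragment.
workflow_reuse_rate = sum(bool(frags & reused)
                          for frags in workflows.values()) / len(workflows)

print(fragment_reuse_rate, workflow_reuse_rate)
```

Indicators like these make the paper's hypothesis testable: if fragments improve reuse, the fragment-level rate should exceed the rate at which entire end-to-end workflows are shared.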


In this work we review some earlier distributed algorithms developed by the authors and collaborators, which are based on two different approaches, namely distributed moment estimation and distributed stochastic approximation. We show applications of these algorithms to image compression, linear classification and stochastic optimal control. In all cases, the benefit of cooperation is clear: even when the nodes have access to only small portions of the data, by exchanging their estimates they achieve the same performance as a centralized architecture that gathers all the data from all the nodes.
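The cooperation idea can be sketched with a toy gossip-averaging scheme on a ring of nodes (a minimal illustration under the assumption of equal-sized data portions, not the authors' actual algorithms): each node starts from the mean of its local data and repeatedly averages its estimate with its neighbours', converging to the same global mean a centralized node would compute.

```python
# Four nodes, each holding an equal-sized portion of the data.
data = [[1.0, 2.0], [10.0, 4.0], [3.0, 5.0], [7.0, 6.0]]
x = [sum(d) / len(d) for d in data]   # each node starts from its local mean

# Synchronous gossip on a ring: every node averages its estimate with its
# two neighbours'. The update is doubly stochastic, so the network average
# is preserved and all estimates converge to it.
n = len(x)
for _ in range(200):
    x = [(x[(i - 1) % n] + x[i] + x[(i + 1) % n]) / 3 for i in range(n)]

centralized = sum(sum(d) for d in data) / sum(len(d) for d in data)
print([round(v, 4) for v in x], centralized)
```

No node ever sees another node's raw data, only estimates, yet every node ends up with the centralized answer, which is the benefit of cooperation the abstract describes.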


The use of fixed-point arithmetic is a widespread design choice in systems with strong area, power or performance constraints. To produce implementations where costs are minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths must be carried out. Finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem to which designers devote between 25 and 50% of the design cycle. Reconfigurable hardware platforms, such as FPGAs, also benefit from the advantages offered by fixed-point arithmetic, since it compensates for the lower clock frequencies and less efficient hardware usage of these platforms with respect to ASICs. As FPGAs become popular for scientific computing, designs grow in size and complexity to the point where they can no longer be handled efficiently by current signal and quantization-noise modelling and word-length optimization techniques. In this PhD thesis we explore different aspects of the quantization problem and present new methodologies for each of them. Techniques based on interval extensions have made it possible to obtain very precise models of signal and quantization-noise propagation in systems with non-linear operations. We take this approach one step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a modern technique based on statistical Modified Affine Arithmetic (MAA) in order to model systems containing control-flow structures. Our methodology generates the different execution paths automatically, determines the regions of the input domain that will exercise each of them, and extracts the statistical moments of the system from these partial solutions. We use this technique to estimate both the dynamic range and the round-off noise in systems with the aforementioned control-flow structures, and we show the accuracy of our approach, which in certain use cases with non-linear operators deviates by only 0.04% with respect to reference values obtained by simulation. A known drawback of techniques based on interval extensions is the combinatorial explosion of terms as the size of the systems under study grows, which leads to scalability problems. To face this problem we present a clustered noise-injection technique that groups the signals of the system, introduces the noise sources for each group separately, and finally combines the results of all groups. In this way, the number of noise sources is kept under control at all times and, as a result, the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results caused by the loss of correlation between noise terms, with the goal of keeping the results as accurate as possible. This PhD thesis also addresses the development of word-length optimization methodologies based on Monte-Carlo simulations that run in reasonable times. To this end we present two new techniques that explore the reduction of execution time from different angles. First, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is then used during the optimization stage. Second, the incremental method revolves around the fact that, although it is strictly necessary to maintain a given confidence interval for the final results of our search, we can employ more relaxed confidence levels, which implies fewer trials per simulation, in the initial stages of the search, when we are still far from the optimized solutions. Through these two approaches we demonstrate that the execution time of classical greedy search algorithms can be accelerated by factors of up to ×240 for small/medium-sized problems. Finally, this book presents HOPLITE, an automated, flexible and modular quantization infrastructure that includes the implementation of the above techniques and is publicly available. Its goal is to offer developers and researchers a common environment for easily prototyping and verifying new quantization methodologies. We describe the workflow, justify the design decisions taken, explain its public API and give a step-by-step demonstration of its operation. In addition we show, through a simple example, how to connect new extensions to the tool using the existing interfaces in order to expand and improve HOPLITE's capabilities. ABSTRACT Using fixed-point arithmetic is one of the most common design choices for systems where area, power or throughput are heavily constrained. In order to produce implementations where the cost is minimized without negatively impacting the accuracy of the results, a careful assignment of word-lengths is required. The problem of finding the optimal combination of fixed-point word-lengths for a given system is a combinatorial NP-hard problem to which developers devote between 25 and 50% of the design-cycle time.
Reconfigurable hardware platforms such as FPGAs also benefit from the advantages of fixed-point arithmetic, as it compensates for the slower clock frequencies and less efficient area utilization of the hardware platform with respect to ASICs. As FPGAs become commonly used for scientific computation, designs constantly grow larger and more complex, up to the point where they cannot be handled efficiently by current signal and quantization noise modelling and word-length optimization methodologies. In this Ph.D. Thesis we explore different aspects of the quantization problem and present new methodologies for each of them: Techniques based on extensions of intervals have made it possible to obtain accurate models of signal and quantization noise propagation in systems with non-linear operations. We take this approach a step further by introducing elements of Multi-Element Generalized Polynomial Chaos (ME-gPC) and combining them with a state-of-the-art Statistical Modified Affine Arithmetic (MAA) based methodology in order to model systems that contain control-flow structures. Our methodology produces the different execution paths automatically, determines the regions of the input domain that will exercise them, and extracts the system statistical moments from the partial results. We use this technique to estimate both the dynamic range and the round-off noise in systems with the aforementioned control-flow structures. We show the good accuracy of our approach, which in some case studies with non-linear operators shows a deviation of only 0.04% with respect to the simulation-based reference values. A known drawback of the techniques based on extensions of intervals is the combinatorial explosion of terms as the size of the targeted systems grows, which leads to scalability problems.
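The growth of noise terms that drives this combinatorial explosion can be illustrated with a minimal affine-arithmetic sketch (an illustrative simplification with hypothetical names such as `AffineSignal`, not the thesis's actual MAA implementation): each signal carries a center value plus a set of independent noise terms, and every fixed-point quantization adds a fresh rounding term, so the number of terms accumulates as the system grows.

```python
# Minimal affine-arithmetic sketch of quantization-noise propagation.
# Illustrative only: real statistical MAA methodologies also track moments.

class AffineSignal:
    """Value modeled as center + sum(coeff_i * eps_i), with eps_i in [-1, 1]."""
    _next_id = 0

    def __init__(self, center, terms=None):
        self.center = center
        self.terms = dict(terms or {})   # noise-term id -> coefficient

    @classmethod
    def _fresh(cls):
        cls._next_id += 1
        return cls._next_id

    def quantize(self, frac_bits):
        """Rounding to `frac_bits` fractional bits adds one new noise term
        of magnitude half an LSB (round-to-nearest)."""
        out = AffineSignal(self.center, self.terms)
        out.terms[self._fresh()] = 2.0 ** (-frac_bits - 1)
        return out

    def __add__(self, other):
        terms = dict(self.terms)
        for k, c in other.terms.items():
            terms[k] = terms.get(k, 0.0) + c
        return AffineSignal(self.center + other.center, terms)

    def bound(self):
        """Worst-case half-width of the accumulated noise interval."""
        return sum(abs(c) for c in self.terms.values())

a = AffineSignal(1.0).quantize(8)   # input quantized to 8 fractional bits
b = AffineSignal(0.5).quantize(8)
c = (a + b).quantize(8)             # the adder output is re-quantized
print(len(c.terms), c.bound())      # 3 independent noise terms
```

Because every operator output adds a term while existing terms persist, the term count scales with system size, which is exactly the scalability problem that clustered noise injection mitigates by processing groups of signals separately.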
To address this issue we present a clustered noise injection technique that groups the signals in the system, introduces the noise terms in each group independently and then combines the results at the end. In this way, the number of noise sources in the system at a given time is controlled and, because of this, the combinatorial explosion is minimized. We also present a multi-way partitioning algorithm aimed at minimizing the deviation of the results due to the loss of correlation between noise terms, in order to keep the results as accurate as possible. This Ph.D. Thesis also covers the development of methodologies for word-length optimization based on Monte-Carlo simulations that run in reasonable times. We do so by presenting two novel techniques that approach the reduction of execution time from two different angles: First, the interpolative method applies a simple but precise interpolator to estimate the sensitivity of each signal, which is later used to guide the optimization effort. Second, the incremental method revolves around the fact that, although we strictly need to guarantee a certain confidence level in the simulations for the final results of the optimization process, we can use more relaxed levels, which in turn implies a considerably smaller number of samples, in the initial stages of the process, when we are still far from the optimized solution. Through these two approaches we demonstrate that the execution time of classical greedy techniques can be accelerated by factors of up to ×240 for small/medium-sized problems. Finally, this book introduces HOPLITE, an automated, flexible and modular framework for quantization that includes the implementation of the previous techniques and is provided for public access. The aim is to offer developers and researchers a common ground for prototyping and verifying new techniques for system modelling and word-length optimization easily.
We describe its workflow, justify the design decisions taken, explain its public API and give a step-by-step demonstration of its execution. We also show, through an example, how new extensions to the flow should be connected to the existing interfaces in order to expand and improve the capabilities of HOPLITE.
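The saving behind the incremental method can be made concrete with a standard sample-size calculation (a hedged sketch under the usual normal approximation; the thesis's exact stopping criterion may differ): for a confidence interval of half-width `half_width` on an estimated mean with standard deviation `sigma`, the required number of Monte-Carlo samples grows with the square of the critical value `z`, so relaxing the confidence level during early search stages cuts the per-evaluation sample count substantially.

```python
# Sketch: Monte-Carlo samples needed for a two-sided confidence interval
# of half-width `half_width` on a mean, via n >= (z * sigma / half_width)^2.
# Illustrative assumption, not the thesis's exact rule.
import math
from statistics import NormalDist

def samples_needed(confidence, sigma, half_width):
    # Two-sided critical value for the requested confidence level.
    z = NormalDist().inv_cdf(0.5 + confidence / 2.0)
    return math.ceil((z * sigma / half_width) ** 2)

# Early, exploratory search stages: relaxed 90% confidence.
early = samples_needed(0.90, sigma=1.0, half_width=0.01)
# Final verification of the optimized solution: strict 99% confidence.
final = samples_needed(0.99, sigma=1.0, half_width=0.01)
print(early, final)   # the relaxed level needs far fewer samples per run
```

Since a greedy word-length search performs many such evaluations before converging, running the early ones at the relaxed level and only the final verification at full confidence is one plausible source of the large speed-ups the incremental method reports.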

Resumo:

We optically imaged a visual masking illusion in primary visual cortex (area V-1) of rhesus monkeys to ask whether activity in the early visual system more closely reflects the physical stimulus or the generated percept. Visual illusions can be a powerful way to address this question because they have the benefit of dissociating the stimulus from perception. We used an illusion in which a flickering target (a bar oriented in visual space) is rendered invisible by two counter-phase flickering bars, called masks, which flank and abut the target. The target and masks, when shown separately, each generated correlated activity on the surface of the cortex. During the illusory condition, however, optical signals generated in the cortex by the target disappeared although the image of the masks persisted. The optical image thus was correlated with perception but not with the physical stimulus.

Resumo:

The availability of gene-targeted mice deficient in the urokinase-type plasminogen activator (uPA), urokinase receptor (uPAR), tissue-type plasminogen activator (tPA), and plasminogen permits a critical, genetic-based analysis of the physiological and pathological roles of the two mammalian plasminogen activators. We report a comparative study of animals with individual and combined deficits in uPAR and tPA and show that these proteins are complementary fibrinolytic factors in mice. Sinusoidal fibrin deposits are found within the livers of nearly all adult mice examined with a dual deficiency in uPAR and tPA, whereas fibrin deposits are never found in livers collected from animals lacking uPAR and rarely detected in animals lacking tPA alone. This is the first demonstration that uPAR has a physiological role in fibrinolysis. However, uPAR-/-/tPA-/- mice do not develop the pervasive, multi-organ fibrin deposits, severe tissue damage, reduced fertility, and high morbidity and mortality observed in mice with a combined deficiency in tPA and the uPAR ligand, uPA. Furthermore, uPAR-/-/tPA-/- mice do not exhibit the profound impairment in wound repair seen in uPA-/-/tPA-/- mice when they are challenged with a full-thickness skin incision. These results indicate that plasminogen activation focused at the cell surface by uPAR is important in fibrin surveillance in the liver, but that uPA supplies sufficient fibrinolytic potential to clear fibrin deposits from most tissues and support wound healing without the benefit of either uPAR or tPA.

Resumo:

The lack of efficient mechanisms for stable genetic transformation of medically important insects, such as anopheline mosquitoes, is the single most important impediment to progress in identifying novel control strategies. Currently available techniques for foreign gene expression in insect cells in culture lack the benefit of stable inheritance conferred by integration. To overcome this problem, a new class of pantropic retroviral vectors has been developed in which the amphotropic envelope is completely replaced by the G glycoprotein of vesicular stomatitis virus. The broadened host cell range of these particles allowed successful entry, integration, and expression of heterologous genes in cultured cells of Anopheles gambiae, the principal mosquito vector responsible for the transmission of over 100 million cases of malaria each year. Mosquito cells in culture infected with a pantropic vector expressing hygromycin phosphotransferase from the Drosophila hsp70 promoter were resistant to the antibiotic hygromycin B. Integrated provirus was detected in infected mosquito cell clones grown in selective media. Thus, pantropic retroviral vectors hold promise as a transformation system for mosquitoes in vivo.