955 results for Exponential growth


Relevance:

60.00%

Publisher:

Abstract:

This review focuses on the use of particulate delivery systems for immunization, including poly(lactide-co-glycolide) (PLGA), ISCOMs, liposomes, niosomes, virosomes, chitosan, and other biodegradable polymers. These systems are evaluated as carriers for protein subunit and DNA vaccines. There is an extensive focus on recent literature and on the understanding of biological interactions, relating these to our present understanding of immunological mechanisms of action. In addition, formulation techniques are considered, including emulsification, solvent diffusion, DNA complexation, and entrapment. The diversity of formulation strategies presented is a testament to the exponential growth of interest in the area of vaccine delivery systems. A case study of the application of particulate vaccine carriers is assessed in terms of vaccine development and recent insights into the possible design and application of vaccines against two of the most important pathogens threatening mankind, for which effective vaccines are critically needed: Mycobacterium tuberculosis and human immunodeficiency virus. This review addresses the rationale for the use of particulate delivery systems in vaccine design in the context of the diversity of carriers for DNA- and protein-based vaccines and their potential application, given the critical need for effective vaccines. © 2005 by Begell House, Inc.

Relevance:

60.00%

Publisher:

Abstract:

Constructing and executing distributed systems that can adapt to their operating context in order to sustain the provided services and their qualities are complex tasks. Managing the adaptation of multiple, interacting services is particularly difficult, since these services tend to be distributed across the system, interdependent, and sometimes tangled with other services. Furthermore, the exponential growth of the number of potential system configurations, derived from the variabilities of each service, needs to be handled. The current practice of writing low-level reconfiguration scripts as part of the system code to handle run-time adaptation is both error-prone and time-consuming, and makes adaptive systems difficult to validate and evolve. In this paper, we propose to combine model-driven and aspect-oriented techniques to better cope with the complexities of constructing and executing adaptive systems, and to handle the exponential growth of the number of possible configurations. Combining these techniques allows us to use high-level domain abstractions, simplify the representation of variants, and limit the combinatorial explosion of possible configurations. In our approach we also use models at runtime to generate the adaptation logic, by comparing the current configuration of the system to a composed model representing the configuration we want to reach. © 2008 Springer-Verlag Berlin Heidelberg.
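
The models-at-runtime step lends itself to a small illustration: the reconfiguration script can be derived by diffing a model of the current configuration against the composed target model. The sketch below is a minimal illustration of that idea, not the authors' implementation; the Component class and diff_models function are hypothetical names.

```python
# Minimal sketch of models-at-runtime adaptation: diff the current
# configuration model against a composed target model to obtain an
# adaptation script. Names here are illustrative, not the paper's API.

from dataclasses import dataclass

@dataclass(frozen=True)
class Component:
    name: str
    variant: str  # the selected variant of this service/aspect

def diff_models(current: set[Component], target: set[Component]) -> list[str]:
    """Generate a reconfiguration script from two configuration models."""
    actions = []
    for comp in current - target:
        actions.append(f"stop {comp.name} ({comp.variant})")
    for comp in target - current:
        actions.append(f"start {comp.name} ({comp.variant})")
    return actions

current = {Component("logging", "verbose"), Component("codec", "h264")}
target = {Component("logging", "minimal"), Component("codec", "h264")}
print(diff_models(current, target))
# e.g. ['stop logging (verbose)', 'start logging (minimal)']
```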

Relevance:

60.00%

Publisher:

Abstract:

This article is linked to my major study on the Poetik des Extremen in that it places the monstrous works of Marianne Fritz within a genealogy of extremist writing in German-language literature. Her literary project Festung, in all likelihood the most extensive 'novel' in Western literary history, is first analysed by tracing the exponential growth of its components, from a paperback of 108 pages to the still-unfinished novel Naturgemäß, which will most probably comprise 15 volumes, mostly of A4 size and equivalent in length to over 20,000 standard pages. In parallel to this quantitative explosion of form, the article also explores the transgression of traditional narration and Fritz's typographical innovations in text presentation. Using reproductions of the late facsimile volumes, an exemplary 'close reading' of one page from Naturgemäß II is undertaken to demonstrate the enormous density of Festung. Finally, the article seeks to differentiate Fritz's magnum opus from other outsized works of literature by focusing on the specific interconnection between the quantitative and the stylistic explosion of the form of the novel, which makes it incomparable to the major works of writers such as Robert Musil or Arno Schmidt.

Relevance:

60.00%

Publisher:

Abstract:

The neural bases of altered consciousness in patients with epilepsy, during seizures and at rest, have attracted significant interest in the last decade. This exponential growth has been supported by the parallel development of techniques and methods to investigate brain function noninvasively with unprecedented spatial and temporal resolution. In this article, we review the contribution of magnetoencephalography to deconvolving the bioelectrical changes associated with impaired consciousness during seizures. We use data collected from a patient with refractory absence seizures to discuss how spike-wave discharges are associated with perturbations of optimal connectivity within and between brain regions, and we discuss indirect evidence suggesting that this phenomenon might explain the cognitive deficits experienced during prolonged 3/s spike-wave discharges. © 2013 Elsevier Inc.

Relevance:

60.00%

Publisher:

Abstract:

We study how the spatial distribution of inertial particles evolves with time in a random flow. We describe the explosive appearance of caustics and show how they influence the exponential growth of clusters due to the smooth parts of the flow, leading in particular to exponential growth of the average distance between particles. We demonstrate how caustics restrict the applicability of the Lagrangian description to inertial particles.
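
For readers who want the smooth-flow mechanism in symbols, it can be stated compactly. The form below is a generic textbook-style sketch, not a formula taken from the paper; lambda_1 denotes the largest Lyapunov exponent of the particle dynamics.

```latex
% Generic sketch (not from the paper): in smooth regions of a random flow,
% the separation R(t) between two nearby inertial particles obeys
\[
  \left\langle \ln \frac{|\mathbf{R}(t)|}{|\mathbf{R}(0)|} \right\rangle
  \simeq \lambda_1 t ,
\]
% so typical interparticle distances grow exponentially at rate \lambda_1,
% until caustics -- where particle trajectories cross and the particle
% velocity field becomes multivalued -- invalidate the smooth Lagrangian
% description.
```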

Relevance:

60.00%

Publisher:

Abstract:

The first author was partially supported by MURST of Italy; the second author was partially supported by RFFI grant 99-01-00233.

Relevance:

60.00%

Publisher:

Abstract:

Myocardial infarction results in the loss of cardiac muscle and a deficit in cardiac performance. Likewise, peripheral artery disease can result in critical limb ischemia, leading to reduced mobility, non-healing ulcers, gangrene and amputation. Both of these common conditions diminish quality of life and increase the risk of mortality. Advances in treatment have led to more people surviving myocardial infarction or living with peripheral artery disease; however, current treatments are inadequate for repairing ischemic tissue. Over the last five years, a vast number of patents have been filed concerning the use of stem cells, which correlates with the exponential growth in stem cell publications. Stem cell therapy offers real potential for replacing ischemic tissue with functional cells. In this paper, we review recent patents concerning stem cell therapy that have the potential to provide or potentiate novel treatments for ischemic cardiovascular disease. In addition, we evaluate the promise of the inventions by describing some clinical trials currently taking place, and consider how current research on ischemic cardiovascular disease may change the patent landscape in the future.

Relevance:

60.00%

Publisher:

Abstract:

This paper presents a novel algorithm for medial surface extraction that is based on the density-corrected Hamiltonian analysis of Torsello and Hancock [1]. In order to cope with the exponential growth of the number of voxels, we compute a first coarse discretization of the mesh, which is iteratively refined until the desired resolution is achieved. The refinement criterion relies on the analysis of the momentum field, where only the voxels with a suitable value of the divergence are exploded to a lower level of the hierarchy. In order to compensate for the discretization errors incurred at the coarser levels, a dilation procedure is added at the end of each iteration. Finally, we design a simple alignment procedure to correct the displacement of the extracted skeleton with respect to the true underlying medial surface. We evaluate the proposed approach with an extensive series of qualitative and quantitative experiments. © 2013 Elsevier Inc. All rights reserved.
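
The coarse-to-fine strategy described above can be sketched in a few lines: keep only voxels whose momentum-field divergence passes a threshold, explode them into octants, and dilate to recover voxels lost to coarse-level error. This is an illustrative reconstruction under stated assumptions, not the authors' code; the divergence callable, the threshold criterion and the dilation stub are placeholders.

```python
# Illustrative coarse-to-fine refinement loop in the spirit of the abstract.
# A voxel is (x, y, z, size); divergence is a user-supplied callable giving
# the momentum-field divergence at a voxel.

def refine_medial_voxels(voxels, divergence, threshold, max_levels):
    """Iteratively refine a coarse voxelization toward the medial surface."""
    for _ in range(max_levels):
        next_level = []
        for v in voxels:
            if abs(divergence(v)) >= threshold:
                # "explode" the voxel to the next hierarchy level
                next_level.extend(split_into_octants(v))
            # low-divergence voxels are dropped: they lie far from the skeleton
        voxels = dilate(next_level)  # compensate coarse-level discretization error
    return voxels

def split_into_octants(v):
    x, y, z, s = v
    h = s / 2.0
    return [(x + dx * h, y + dy * h, z + dz * h, h)
            for dx in (0, 1) for dy in (0, 1) for dz in (0, 1)]

def dilate(voxels):
    # Placeholder: a real implementation would add face-neighbours of the
    # surviving voxels; here the set is returned unchanged.
    return voxels
```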

Relevance:

60.00%

Publisher:

Abstract:

The major barrier to the practical optimization of pavement preservation programming has always been that, for formulations in which the identity of individual projects is preserved, the solution space grows exponentially with problem size, to an extent where it becomes unmanageable by traditional analytical optimization techniques within reasonable limits. This is the problem of combinatorial explosion, that is, the exponential growth of the number of combinations. The relatively large number of constraints often present in real-life pavement preservation programming problems, together with the trade-off considerations required between preventive maintenance, rehabilitation and reconstruction, is a further factor contributing to the solution complexity. In this research study, a new integrated multi-year optimization procedure was developed to solve network-level pavement preservation programming problems through cost-effectiveness-based evolutionary programming analysis, using the Shuffled Complex Evolution (SCE) algorithm. A case study problem was analyzed to illustrate the robustness and consistency of the SCE technique in solving network-level pavement preservation problems. The output of this program is a list of maintenance and rehabilitation (M&R) treatment strategies for each identified segment of the network in each programming year, together with the impact on the overall performance of the network, in terms of the performance levels of the recommended optimal M&R strategy. The results show that SCE is very efficient and consistent in the simultaneous consideration of the trade-offs between the various pavement preservation strategies, while preserving the identity of the individual network segments. The flexibility of the technique is also demonstrated in the sense that, by suitably coding the problem parameters, it can be used to solve several forms of pavement management programming problems. It is recommended that, for large networks, some form of decomposition technique be applied to aggregate sections that exhibit similar performance characteristics into links, such that whatever M&R alternative is recommended for a link can be applied to all the sections connected to it. In this way the problem size, and hence the solution time, can be greatly reduced to a more manageable solution space. The study concludes that the robust search characteristics of SCE are well suited to solving the combinatorial problems of long-term, network-level pavement M&R programming, and that this provides a rich area for future research.
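
As a rough illustration of how an SCE-style search preserves project identity while exploring the combinatorial space, the toy sketch below encodes a candidate programme as one treatment per segment per year and evolves complexes of candidates. It is a simplified member of the SCE family under hypothetical names (TREATMENTS, fitness); the study's implementation, budget constraints and simplex-based sub-steps are not reproduced.

```python
# Toy SCE-style search for a network-level M&R programme (illustrative only).
import random

TREATMENTS = ["do-nothing", "preventive", "rehab", "reconstruct"]

def random_programme(n_segments, n_years):
    # one treatment per segment per year: project identity is preserved
    return [[random.choice(TREATMENTS) for _ in range(n_years)]
            for _ in range(n_segments)]

def evolve_complex(cplx, fitness, steps=5):
    """Crude competitive evolution: mutate a copy of the best member and
    replace the worst member whenever the mutation improves on it."""
    for _ in range(steps):
        cplx.sort(key=fitness, reverse=True)
        child = [row[:] for row in cplx[0]]
        seg = random.randrange(len(child))
        yr = random.randrange(len(child[0]))
        child[seg][yr] = random.choice(TREATMENTS)
        if fitness(child) > fitness(cplx[-1]):
            cplx[-1] = child
    return cplx

def sce(fitness, n_segments, n_years, pop=20, n_complexes=4, rounds=50):
    population = [random_programme(n_segments, n_years) for _ in range(pop)]
    for _ in range(rounds):
        population.sort(key=fitness, reverse=True)
        # deal members into complexes, evolve each, then shuffle back together
        complexes = [population[i::n_complexes] for i in range(n_complexes)]
        population = [m for c in complexes for m in evolve_complex(c, fitness)]
    return max(population, key=fitness)

# Example with a hypothetical cost-effectiveness score:
def fitness(prog):
    return sum(1.0 if t == "preventive" else 0.2 if t != "do-nothing" else 0.0
               for seg in prog for t in seg)

best = sce(fitness, n_segments=5, n_years=3)
```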

Relevance:

60.00%

Publisher:

Abstract:

The Internet has become an integral part of our nation's critical socio-economic infrastructure. With its heightened use and growing complexity, however, organizations are at greater risk of cyber crimes. To aid in the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together the needed digital evidence. It provides a platform for performing deep network analysis by capturing, recording and analyzing network events to find the source of a security attack or other information security incidents. Existing network forensics work has mostly focused on the Internet and fixed networks, but the exponential growth in the use of wireless technologies, coupled with their unprecedented characteristics, necessitates the development of new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad hoc network forensics. It was one of the first works to identify this problem and to offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and to report logged incidents. For recording incidents, location is considered essential to documenting network incidents; however, in network topology spaces, location cannot be measured owing to the absence of a 'distance metric'. A novel solution was therefore proposed to label the locations of nodes within network topology spaces, and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHTs) was adopted. Although the direct use of DHTs for reporting logged incidents would result in uncontrollably recursive traffic, a new mechanism was introduced that overcomes this recursion. These logging and reporting techniques aid forensics over cellular and ad hoc networks, which in turn increases the ability to track and trace attacks to their source. They are a starting point for further research and development that would equip future ad hoc networks with forensic components complementing existing security mechanisms.
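
To make the DHT-based reporting idea concrete, here is a hedged, generic consistent-hashing sketch: incidents are stored at the node that owns the hash of a report key, so logs are decentralised yet findable. All names are illustrative, and the dissertation's specific recursion-avoidance mechanism is not reproduced.

```python
# Generic consistent-hashing sketch of DHT-style incident reporting
# (illustrative; not the dissertation's protocol).
import hashlib
from bisect import bisect_right

class TinyDHT:
    def __init__(self, node_ids):
        # each node owns the keyspace arc ending at its hashed id
        self.ring = sorted((self._h(n), n) for n in node_ids)
        self.store = {n: [] for n in node_ids}

    @staticmethod
    def _h(value: str) -> int:
        return int(hashlib.sha1(value.encode()).hexdigest(), 16)

    def _owner(self, key: str) -> str:
        # first node clockwise from the key's hash, wrapping around the ring
        i = bisect_right(self.ring, (self._h(key), chr(0x10FFFF)))
        return self.ring[i % len(self.ring)][1]

    def report(self, key: str, incident: str):
        self.store[self._owner(key)].append((key, incident))

    def lookup(self, key: str):
        return [i for k, i in self.store[self._owner(key)] if k == key]

dht = TinyDHT(["nodeA", "nodeB", "nodeC"])
dht.report("label:42", "spoofed beacon at t=1023")  # hypothetical topology label
print(dht.lookup("label:42"))
```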


Relevance:

60.00%

Publisher:

Abstract:

The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted by the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4,000,000 data points) from 539 papers had been archived. Here we present the development of this data compilation in the five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans remains relatively low. Data from 60 studies that investigated the response of a mix of organisms or of natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance, with considerably more data archived on calcification and primary production than on other processes, has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, and to help develop standard vocabularies describing the variables and define best practices for archiving ocean acidification data.

Relevance:

60.00%

Publisher:

Abstract:

Faced with a scenario of agribusiness expansion and increasing fertilizer consumption driven by population growth, existing reserves must be used more efficiently, yielding products of better quality and in quantities adequate to meet national demand. At the Tapira Mining Complex of Vale Fertilizantes, a phosphate concentrate grading 35.0% P2O5 is produced from ore grading about 8.0% P2O5; the concentrate supplies the Uberaba Industrial Complex and the Araxá Mining-Chemical Complex for fertilizer production. The industrial flotation step, responsible for P2O5 recovery and hence for the viability of the business, is divided into friable, granular and ultrafine circuits, and the friable and granular concentrates make up the conventional concentrate. Today only 14.7% of the mass fed to the plant becomes product; the remainder is lost in the process, and the largest mass losses occur in the flotation tailings, which account for 42.3%. From 2012 to 2014, the daily overall mass recovery of the processing plant varied from 12.4 to 15.9%, while the daily metallurgical recovery of P2O5 varied from 48.7 to 82.4%; this degree of variability shows that the plant operated under quite different conditions. This study therefore analysed the influence of operational and process variables on the P2O5 mass and metallurgical recoveries of the friable, granular and ultrafine industrial flotation circuits. In addition, the effects of ore variables (grades, hardness and the percentage of ore from mining front 02) on the overall recoveries of the processing plant, and of reagent dosages on bench-scale flotation recoveries, were analysed using experimental design methodology. All the work used the historical database of Vale Fertilizantes at Tapira-MG, with all independent variables made dimensionless over the experimental range used. The statistical analysis used the response surface technique, and the values of the independent variables that maximize recoveries were found by canonical analysis. For the friable industrial flotation circuit, a mass recovery of 41.3% and a P2O5 metallurgical recovery of 91.3% were obtained, good values for this circuit; the highest recoveries occur for a solids concentration in the new flotation feed between 45 and 50%, values attributed to the residence time of the pulp in the industrial flotation cells and columns. The larger the number of ore stockpiles being reclaimed, the higher the mass recovery, but in this scenario flotation becomes unstable because of large mass variations in the feed. Higher mass recoveries are obtained for depressant dosages above 120 g/t and a synthetic collector percentage of 11.6%. For the granular industrial flotation circuit, a mass recovery of 28.3% and a P2O5 metallurgical recovery of 79.4% were obtained, also good values for this circuit. Higher recoveries are obtained when the percentage of ore from front 02 is above 90%, because ore from this front contains clearer apatite; recoveries likewise increase when the pulp level in the rougher stage is at its highest, owing to the high circulating load this stage receives. For the ultrafine industrial flotation circuit, a mass recovery of 23.95% was obtained, maximized at depressant and collector dosages of 420 and 300 g/t, respectively. The analysis of the influence of the ore variables showed that higher recoveries are obtained for ores with a P2O5 grade above 8.0%, an Fe2O3 content of about 28%, and a front 02 ore percentage of 83%. The percentage of hard ore strongly influences recoveries because the mass split in the circuit is linked to this variable; however, the hard ore percentage that maximizes recoveries was very close to the design capacity of the processing plant, which is 20%. Finally, the bench flotation study showed that in the friable and granular circuits the highest recoveries are achieved for collector dosages above 250 g/t, and that simultaneously increasing the collector dosage and the synthetic collector percentage raises flotation recovery; this scenario, however, tends to produce a concentrate with a lower P2O5 grade, showing that the highest recovery is not always the ideal scenario. Overall, the results identify the variable settings that provide the highest flotation recoveries and hence the lowest losses to the tailings.
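
The response-surface and canonical-analysis workflow mentioned above can be sketched generically: fit a second-order model in coded (dimensionless) factors and solve for the stationary point. The data below are synthetic and the factor names are illustrative; this is not the study's dataset or model.

```python
# Generic response-surface / canonical-analysis sketch (synthetic data).
# Coded factors: x1 = depressant dosage, x2 = collector dosage, each in -1..1.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 2))            # coded factor settings
y = (80 - 5 * (X[:, 0] - 0.3) ** 2              # synthetic "recovery" surface
        - 8 * (X[:, 1] + 0.2) ** 2
        + rng.normal(0, 0.5, 30))               # experimental noise

# Second-order model: y = b0 + b1 x1 + b2 x2 + b11 x1^2 + b22 x2^2 + b12 x1 x2
A = np.column_stack([np.ones(len(y)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
b = np.linalg.lstsq(A, y, rcond=None)[0]

# Canonical analysis: stationary point x_s = -0.5 * B^{-1} g, with
# g the linear coefficients and B the symmetric quadratic-coefficient matrix.
g = np.array([b[1], b[2]])
B = np.array([[b[3], b[5] / 2], [b[5] / 2, b[4]]])
x_s = -0.5 * np.linalg.solve(B, g)              # maximizing settings (coded)
eigvals = np.linalg.eigvalsh(B)                 # all negative -> a maximum
print("stationary point (coded):", x_s.round(2), "eigenvalues:", eigvals.round(2))
```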

Relevance:

60.00%

Publisher:

Abstract:

The integration of information and communication technologies (ICT) into educational settings is a concrete avenue for action and reflection in the education sciences. Researchers and practitioners alike are asking how technologies should be integrated and what means should be put in place for this sometimes complex process to succeed. Indeed, the penetration of technological tools into schools has been exponential in recent years, and it is now necessary to understand from what perspectives these tools are being integrated into the classroom. A striking example is the touch tablet, recently introduced on a massive scale in North American and European schools. This relatively new tool in the school sphere calls for careful reflection on teachers' pedagogical practices and the integration processes involved. To address these questions, we designed a three-stage study. First, we drew up an exhaustive portrait of the pedagogical practices of teachers who use the touch tablet daily in the classroom; this portrait allows us to sketch a synthesis of the pedagogical uses and realities surrounding the tool. Second, we catalogued, analysed and classified the ICT integration models found in the literature; analysing these models allowed us to extract their intrinsic strengths and shortcomings. We then created a synthesis model bringing together the insights from these analyses, and in parallel created a typology for identifying and classifying such models. Third, starting from teachers' pedagogical practices and from the general ICT integration model we had designed, we sought to understand the process by which the tablet is integrated into the classroom. The results show that tablet use gives rise to innovative pedagogical practices that facilitate teaching and promote student learning. However, we find that the tablet is not used to its full potential and that some uses should be reconsidered from a more efficient and better-adapted perspective. As for the integration processes, we identified several indispensable elements: the processes must be iterative and constructive, internal and external factors must be considered, and levels of integration must be identified. The resulting model specifies which model should be favoured and which outcomes should be considered. Following this step, we designed an integration model specifically dedicated to the tablet; beyond the characteristics defined in the general model, it highlights the necessity of training, stakeholder involvement, constant adjustment of pedagogical practices, and indispensable iteration. In light of these considerations, we conclude that the appropriation of the tablet is still a work in progress and that new deployments, like existing ones, must carefully analyse the ins and outs of the practices mediated by the tool's integration. The document ends with a synthesis of results and recommendations intended to promote the integration of the touch tablet, and of ICT in general, into the classroom.

Relevance:

60.00%

Publisher:

Abstract:

Incumbent telecommunication lasers emitting at 1.5 µm are fabricated on InP substrates and consist of multiple strained quantum well layers of the ternary alloy InGaAs, with barriers of InGaAsP or InGaAlAs. These lasers exhibit a very strong temperature dependence of the threshold current, which leads to a situation where external cooling equipment is required to stabilise their optical output power. This results in a significant increase in the energy bill associated with telecommunications, as well as a large increase in equipment budgets. If the exponential growth of end-user bandwidth demand associated with the internet continues, these inefficient lasers could see the telecommunications industry become the dominant consumer of world energy. For this reason there is strong interest in developing new, much more efficient telecommunication lasers. One avenue being investigated is the development of quantum dot lasers on InP. The confinement experienced in these low-dimensional structures strongly perturbs the density of states at the band edge and has been predicted to reduce the temperature dependence of the threshold current in these devices. The growth of these structures is difficult owing to the large lattice mismatch between InP and InAs; recently, however, quantum dots elongated in one dimension, known as quantum dashes, have been demonstrated. Chapter 4 of this thesis provides an experimental analysis of one of these quantum dash lasers emitting at 1.5 µm, along with a numerical investigation of the threshold dynamics present in this device. Another avenue being explored to increase the efficiency of telecommunication lasers is the bandstructure engineering of GaAs-based materials to emit at 1.5 µm. The cause of the strong temperature sensitivity in InP-based quantum well structures has been shown to be CHSH Auger recombination. Calculations have shown, and experiments have verified, that the addition of bismuth to GaAs strongly reduces the bandgap and increases the spin-orbit splitting energy of the alloy GaAs1−xBix. This leads to a bandstructure condition at x = 10% where not only is 1.5 µm emission achieved on GaAs-based material, but the bandstructure can also naturally suppress the costly CHSH Auger recombination that plagues InP-based quantum well material. It has been predicted that telecommunication lasers based on this material system should operate without external cooling equipment and offer electrical and optical benefits over the incumbent lasers. Chapters 5, 6, and 7 provide a first analysis of several aspects of this material system relevant to the development of high-bismuth-content telecommunication lasers.
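
The bandstructure argument in the final paragraph can be stated as a compact condition. The inequality below is the standard criterion discussed in the GaAsBi literature, quoted here as context rather than as a result of this thesis.

```latex
% Standard CHSH Auger suppression criterion (context, not a thesis result):
% the CHSH process promotes a hole into the spin--orbit split-off band and
% requires roughly the bandgap energy E_g to be available, so it becomes
% energetically forbidden once the split-off energy exceeds the gap:
\[
  \Delta_{\mathrm{SO}} > E_g
  \quad\Longrightarrow\quad
  \text{CHSH Auger recombination is suppressed.}
\]
% In GaAs$_{1-x}$Bi$_x$, adding Bi rapidly reduces $E_g$ and increases
% $\Delta_{\mathrm{SO}}$; the crossover is reached near $x \approx 10\%$,
% where the alloy also emits at $1.5\,\mu\mathrm{m}$ on GaAs.
```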