962 results for System complexity
Abstract:
Systems thinking allows companies to use subjective constructs and indicators, such as recursiveness, cause-effect relationships and autonomy, in performance evaluation. Thus, the question that motivates this paper is: are Brazilian companies searching for new performance measurement and evaluation models based on systems thinking? The study investigates such models, looking for systems-thinking roots in their frameworks. It was both exploratory and descriptive, based on a multiple-case-study strategy covering four companies in the chemical sector. The findings showed that the organizational models have some characteristics that can be related to systems thinking, such as system control and communication. Complexity and autonomy are poorly formalized by the companies. Within this context, the data suggest that systems thinking is suited to organizational performance evaluation but remains distant from management practice.
Abstract:
Background: The public health system of Brazil is structured as a network of increasing complexity, but the low resolution of emergency care at pre-hospital units and the lack of organization of patient flow overloaded the hospitals, mainly those of higher complexity. Awareness of this phenomenon led Ribeirão Preto to implement the Medical Regulation Office and the Mobile Emergency Attendance System. The objective of this study was to analyze the impact of these services on the severity profile of non-traumatic afflictions in a University Hospital. Methods: The study conducted a retrospective analysis of the medical records of 906 patients older than 13 years of age who entered the Emergency Care Unit of the Hospital of the University of São Paulo School of Medicine at Ribeirão Preto. All presented acute non-traumatic afflictions and were admitted to the Internal Medicine, Surgery or Neurology Departments during two study periods: May 1996 (before) and May 2001 (after the implementation of the Medical Regulation Office and Mobile Emergency Attendance System). Demographics and mortality risk levels calculated by the Acute Physiology and Chronic Health Evaluation II (APACHE II) were determined. Results: From 1996 to 2001, the mean age increased from 49 ± 0.9 to 52 ± 0.9 years (P = 0.021), as did the percentage of patients with co-morbidities, from 66.6% to 77.0% (P = 0.0001), and the number of in-hospital complications, from 260 to 284 (P = 0.0001); the mean calculated APACHE II mortality risk increased from 12.0 ± 0.5 to 14.8 ± 0.6 (P = 0.0008) and the mortality rate from 6.1% to 12.2% (P = 0.002). The differences were more significant for patients admitted to the Internal Medicine Department. Conclusion: The implementation of the Medical Regulation Office and Mobile Emergency Attendance System contributed to directing patients with higher severity scores to the Emergency Care Unit, demonstrating the potential of these services for the hierarchical structuring of pre-hospital networks and referrals.
Abstract:
Providing support for multimedia applications on low-power mobile devices remains a significant research challenge. This is primarily due to two reasons:
• Portable mobile devices have modest sizes and weights, and therefore inadequate resources: low CPU processing power, reduced display capabilities, and limited memory and battery lifetimes compared to desktop and laptop systems.
• On the other hand, multimedia applications tend to have distinctive QoS and processing requirements which make them extremely resource-demanding.
This innate conflict introduces key research challenges in the design of multimedia applications and device-level power optimization. Energy efficiency in this kind of platform can be achieved only via a synergistic hardware and software approach. In fact, while Systems-on-Chip are more and more programmable, thus providing functional flexibility, hardware-only power reduction techniques cannot keep consumption within acceptable bounds. It is well understood, both in research and industry, that system configuration and management cannot be controlled efficiently by relying only on low-level firmware and hardware drivers. In fact, at this level there is a lack of information about user application activity and, consequently, about the impact of power management decisions on QoS. Even though operating system support and integration is a requirement for effective performance and energy management, more effective and QoS-sensitive power management is possible if power awareness and hardware configuration control strategies are tightly integrated with domain-specific middleware services. The main objective of this PhD research has been the exploration and integration of a middleware-centric energy management with applications and the operating system. We chose to focus on the CPU-memory and video subsystems, since they are the most power-hungry components of an embedded system. A second main objective has been the definition and implementation of software facilities (such as toolkits, APIs, and run-time engines) to improve the programmability and performance efficiency of such platforms.
Enhancing energy efficiency and programmability of modern Multi-Processor Systems-on-Chip (MPSoCs). Consumer applications are characterized by tight time-to-market constraints and extreme cost sensitivity. The software that runs on modern embedded systems must be high performance, real time and, even more important, low power. Although much progress has been made on these problems, much remains to be done. Multi-Processor Systems-on-Chip (MPSoCs) are increasingly popular platforms for high-performance embedded applications. This leads to interesting challenges in software development, since efficient software development is a major issue for MPSoC designers. An important step in deploying applications on multiprocessors is to allocate and schedule concurrent tasks to the processing and communication resources of the platform. The problem of allocating and scheduling precedence-constrained tasks on processors in a distributed real-time system is NP-hard. There is a clear need for deployment technology that addresses these multiprocessing issues. This problem can be tackled by means of specific middleware which takes care of allocating and scheduling tasks on the different processing elements and which also tries to optimize the power consumption of the entire multiprocessor platform.
This dissertation is an attempt to develop insight into efficient, flexible and optimal methods for allocating and scheduling concurrent applications to multiprocessor architectures. It is a well-known problem in the literature: this kind of optimization problem is very complex even in much simplified variants, therefore most authors propose simplified models and heuristic approaches to solve it in reasonable time. Model simplification is often achieved by abstracting away platform implementation "details". As a result, optimization problems become more tractable, even reaching polynomial time complexity. Unfortunately, this approach creates an abstraction gap between the optimization model and the real HW-SW platform. The main issue with heuristics or, more generally, with incomplete search is that they introduce an optimality gap of unknown size: they provide very limited or no information on the distance between the best computed solution and the optimal one. The goal of this work is to address both the abstraction and the optimality gaps, formulating accurate models which account for a number of "non-idealities" in real-life hardware platforms, developing novel mapping algorithms that deterministically find optimal solutions, and implementing the software infrastructures required by developers to deploy applications on the target MPSoC platforms.
Energy-Efficient LCD Backlight Autoregulation on a Real-Life Multimedia Application Processor. Despite the ever increasing advances in Liquid Crystal Display (LCD) technology, LCD power consumption is still one of the major limitations to the battery life of mobile appliances such as smartphones, portable media players, gaming and navigation devices. There is a clear trend towards increasing LCD size to exploit the multimedia capabilities of portable devices that can receive and render high-definition video and pictures. Multimedia applications running on these devices require LCD screen sizes of 2.2 to 3.5 inches and more to display video sequences and pictures with the required quality. LCD power consumption depends on the backlight and pixel-matrix driving circuits and is typically proportional to the panel area. As a result, this contribution is also likely to be considerable in future mobile appliances. To address this issue, companies are proposing low-power technologies suitable for mobile applications, supporting low-power states and image control techniques. On the research side, several power-saving schemes and algorithms can be found in the literature. Some of them exploit software-only techniques to change the image content and reduce the power associated with crystal polarization; others aim at decreasing the backlight level while limiting the perceived quality degradation by compensating the luminance reduction with pixel-by-pixel image processing algorithms. The major limitation of these techniques is that they rely on the CPU to perform pixel-based manipulations, and their impact on CPU utilization and power consumption has not been assessed. This PhD dissertation shows an alternative approach that exploits, in a smart and efficient way, the hardware image processing unit integrated in almost every current multimedia application processor to implement a hardware-assisted image compensation that allows dynamic scaling of the backlight with a negligible impact on QoS.
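The core idea of backlight scaling with image compensation can be illustrated in a few lines of code. The sketch below is a simplified, software-only illustration (not the hardware-assisted method proposed in the dissertation): the backlight is dimmed by a factor and pixel values are boosted to roughly preserve perceived luminance, with clipping marking where quality loss would appear; the linear power model and the numbers are assumptions for the example.

    import numpy as np

    def compensate_frame(frame, dim_factor):
        """Dim the backlight by `dim_factor` (0 < dim_factor <= 1) and boost pixel
        values so perceived luminance (backlight * pixel) is roughly preserved.
        Pixels that would exceed full scale are clipped, which is where visible
        quality loss can appear."""
        # frame: array of normalized pixel intensities in [0, 1]
        return np.clip(frame / dim_factor, 0.0, 1.0)

    def approx_backlight_power(dim_factor, full_power_mw=300.0):
        # Crude assumption: backlight power roughly proportional to its level.
        return full_power_mw * dim_factor

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        frame = rng.uniform(0.0, 0.8, size=(240, 320))   # synthetic dark-ish frame
        dim = 0.7                                         # run the backlight at 70%
        out = compensate_frame(frame, dim)
        clipped = np.mean(frame / dim > 1.0)
        print(f"power: {approx_backlight_power(dim):.0f} mW, "
              f"clipped pixels: {clipped:.1%}, mean boost: {out.mean() / frame.mean():.2f}x")

The trade-off is visible directly in the numbers: the deeper the dimming, the larger the share of clipped pixels and hence the larger the potential quality loss.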
The proposed approach overcomes CPU-intensive techniques by saving system power without requiring either a dedicated display technology or hardware modification.
Thesis Overview. The remainder of the thesis is organized as follows. The first part focuses on enhancing the energy efficiency and programmability of modern Multi-Processor Systems-on-Chip (MPSoCs). Chapter 2 gives an overview of architectural trends in embedded systems, illustrating the principal features of new technologies and the key challenges still open. Chapter 3 presents a QoS-driven methodology for optimal allocation and frequency selection for MPSoCs; the methodology is based on functional simulation and full-system power estimation. Chapter 4 targets allocation and scheduling of pipelined stream-oriented applications on top of distributed-memory architectures with messaging support; we tackle the complexity of the problem by means of decomposition and no-good generation, and prove the increased computational efficiency of this approach with respect to traditional ones. Chapter 5 presents a cooperative framework to solve the allocation, scheduling and voltage/frequency selection problem to optimality for energy-efficient MPSoCs, while Chapter 6 takes applications with conditional task graphs into account. Finally, Chapter 7 proposes a complete framework, called Cellflow, to help programmers achieve efficient software implementation on a real architecture, the Cell Broadband Engine processor. The second part focuses on energy-efficient software techniques for LCD displays. Chapter 8 gives an overview of portable-device display technologies, illustrating the principal features of LCD video systems and the key challenges still open. Chapter 9 reviews several energy-efficient software techniques from the literature, while Chapter 10 illustrates in detail our method for saving significant power in an LCD panel. Finally, conclusions are drawn, reporting the main research contributions discussed throughout this dissertation.
Abstract:
Myc is a transcription factor that can activate transcription of several hundred genes by direct binding to their promoters at specific DNA sequences (E-boxes). However, recent studies have also shown that it can exert its biological role by repressing transcription. Such studies collectively support a model in which c-Myc-mediated repression occurs through interactions with transcription factors bound to promoter DNA regions rather than through direct recognition of typical E-box sequences. Here, we investigated whether N-Myc can also repress gene transcription, and how this is achieved mechanistically. We used human neuroblastoma cells as a model system, in which N-MYC amplification/over-expression represents a key prognostic marker of this tumour. By means of transcription profile analyses we identified at least five genes (TRKA, p75NTR, ABCC3, TG2, p21) that are specifically repressed by N-Myc. Through a dual-step ChIP assay and genetic dissection of gene promoters, we found that N-Myc is physically associated with gene promoters in vivo, in proximity to the transcription start site. N-Myc association with promoters requires interaction with other proteins, such as the Sp1 and Miz1 transcription factors. Furthermore, we found that N-Myc may repress gene expression by interfering directly with Sp1 and/or Miz1 activity (i.e. TRKA, p75NTR, ABCC3, p21) or by recruiting Histone Deacetylase 1 (Hdac1) (i.e. TG2). In vitro analyses show that distinct N-Myc domains can interact with Sp1, Miz1 and Hdac1, supporting the idea that Myc may participate in distinct repression complexes by interacting specifically with diverse proteins. Finally, the results show that N-Myc, through the repressed genes, affects important cellular functions, such as apoptosis, growth, differentiation and motility. Overall, our results support a model in which N-Myc, like c-Myc, can repress gene transcription by direct interaction with Sp1 and/or Miz1, and provide further evidence of the importance of transcriptional repression by Myc factors in tumour biology.
Abstract:
Survivin, a unique member of the family of inhibitors of apoptosis (IAP) proteins, orchestrates intracellular pathways during cell division and apoptosis. Its central regulatory function in vertebrate molecular pathways, as a mitotic regulator and inhibitor of apoptotic cell death, has major implications for tumor cell proliferation and viability, and has inspired several approaches that target survivin for cancer therapy. Analyses in early-branching Metazoa so far suggest an exclusive role of survivin as a chromosomal passenger protein, whereas the second, complementary anti-apoptotic function might have arisen only later during evolution, concurrent with increased organismal complexity. To lift the veil on the ancestral function(s) of this key regulatory molecule, a survivin homologue of the phylogenetically oldest extant metazoan taxon (phylum Porifera) was identified and functionally characterized. SURVL of the demosponge Suberites domuncula shares significant similarities with its metazoan homologues, ranging from conserved exon/intron structures to the presence of a localization signal and protein-interaction domains characteristic of IAP proteins. Whereas sponge tissue displayed a very low steady-state level, SURVL expression was significantly up-regulated in rapidly proliferating primmorph cells. In addition, challenge of sponge tissue and primmorphs with cadmium and the lipopeptide Pam3Cys-Ser-(Lys)4 stimulated SURVL expression, concurrent with the expression of newly discovered poriferan caspases (CASL and CASL2). Complementary functional analyses in transfected HEK-293 cells revealed that heterologous expression of poriferan survivin in human cells not only promotes cell proliferation but also augments resistance to cadmium-induced cell death. Taken together, these results demonstrate both a deeply evolutionarily conserved and fundamental dual role of survivin and an equally conserved central position of this key regulatory molecule in the interconnected pathways of cell cycle and apoptosis. Additionally, SDCASL, SDCASL2, and SDTILRc (TIR-LRR containing protein) may represent new components of the innate defense sentinel in sponges. SDCASL and SDCASL2 are two new caspase-homologue proteins with a singular structure. In addition to their CASc domains, SDCASL and SDCASL2 feature a small NH2-terminal prodomain (as in effector caspases) and a remarkably long COOH-terminal domain containing one or several functional double-stranded RNA-binding domains (dsrm). This new caspase prototype may reflect a caspase specialization coupling pathogen sensing and apoptosis, and could represent a very efficient defense mechanism. SDTILRc also encompasses a unique combination of domains: several leucine-rich repeats (LRR) and a Toll/IL-1 receptor (TIR) domain. This unusual domain association may correspond to a new family of intracellular sensing proteins, forming a subclass of pattern recognition receptors (PRR).
Abstract:
Theatricality is a common concept used to define theatre in Africa. When this concept is applied, the historical differences between the various forms of theatre recede into the background. It is therefore important to place theatre in the cultural context from which it emerges. This makes it possible to analyse the national and international suprastructures that shape the socio-political and economic climate. Since current "global" development rests on neoliberal principles, it is evident that theatre cannot be discussed without examining neoliberalism, imperialism, capitalism, development aid and donor policy. At present, most theatre projects in Tanzania are supported by development aid or foreign donor organizations. These organizations provide funding to make theatre productions possible at different levels. This funding practice has led to the misconception that theatre is only theatre when it is financed by foreign organizations. It is evident, however, that these funds play a major role in power politics. This study therefore investigates the question: What influence does neoliberal policy, particularly through development aid, have on theatre in Tanzania? The thesis first uncovers the connections between the theatre that is produced and the various dominant political currents, from nationalism to neoliberalism. It further shows that these connections make it difficult for theatre to escape the very suprastructures through which it is financed. This means that neoliberal policy, with its traits of constriction, oppression and exploitation, also produces a constricted, oppressed and exploitative theatre. This study calls such a theatre the theatre of (neo-)liberalism: a theatre that presents itself as apolitical but in fact struggles to survive under the neoliberal policy of free markets and subsidy cuts. By tracing these connections between theatre, development aid and donor organizations, the research reaches the following conclusion: donor organizations have no right, regardless of the size of their donation, to interfere in the sovereignty of a state or to impose a new system. Detachment from foreign donor countries should therefore come first, so that theatre can develop fully and survive independently. It is consequently necessary to redefine the concept of popular theatre: theatre should once again engage with people's own initiatives and address their own topics within a given temporal and spatial frame.
Abstract:
This thesis deals with the development and improvement of linear-scaling algorithms for electronic-structure-based molecular dynamics. Molecular dynamics is a method for the computer simulation of the complex interplay between atoms and molecules at finite temperature. A decisive advantage of this method is its high accuracy and predictive power. However, the computational cost, which in principle scales cubically with the number of atoms, prevents its application to large systems and long time scales. Starting from a new formalism, based on the grand-canonical potential and a factorization of the density matrix, the diagonalization of the corresponding Hamiltonian matrix is avoided. The formalism exploits the fact that the Hamiltonian and the density matrix are sparse due to localization. This reduces the computational cost so that it scales linearly with system size. To demonstrate its efficiency, the resulting algorithm is applied to a system of liquid methane subjected to extreme pressure (about 100 GPa) and extreme temperature (2000 - 8000 K). In the simulation, methane dissociates at temperatures above 4000 K. The formation of sp²-bonded polymeric carbon is observed. The simulations give no indication of diamond formation and therefore have implications for existing planetary models of Neptune and Uranus. Since avoiding the diagonalization of the Hamiltonian matrix involves the inversion of matrices, the problem of computing an (inverse) p-th root of a given matrix is also addressed. This results in a new formula for symmetric positive definite matrices. It generalizes the Newton-Schulz iteration, Altman's formula for bounded non-singular operators, and Newton's method for finding roots of functions. It is proved that the order of convergence is always at least quadratic and that adaptive tuning of a parameter q leads to better results in all cases.
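For orientation, the classical Newton-Schulz iteration that the new formula generalizes can be sketched in a few lines of numpy. The sketch below covers only the plain matrix-inverse case (p = 1), not the generalized (inverse) p-th root formula derived in the thesis; the test matrix and iteration count are arbitrary choices for illustration.

    import numpy as np

    def newton_schulz_inverse(a, iters=30):
        """Classical Newton-Schulz iteration X_{k+1} = X_k (2I - A X_k), which
        converges quadratically to A^{-1} for a suitable starting guess."""
        n = a.shape[0]
        identity = np.eye(n)
        # Safe starting guess: X0 = A^T / (||A||_1 * ||A||_inf)
        x = a.T / (np.linalg.norm(a, 1) * np.linalg.norm(a, np.inf))
        for _ in range(iters):
            x = x @ (2.0 * identity - a @ x)
        return x

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        m = rng.standard_normal((50, 50))
        a = m @ m.T + 50 * np.eye(50)          # symmetric positive definite test matrix
        x = newton_schulz_inverse(a)
        print("residual ||AX - I||_F =", np.linalg.norm(a @ x - np.eye(50)))

Because the update uses only matrix-matrix products, iterations of this type preserve sparsity patterns well, which is what makes them attractive in linear-scaling electronic-structure methods.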
Abstract:
In recent years, systems engineering has become one of the major research domains. The complexity of systems has constantly increased, and nowadays Cyber-Physical Systems (CPS) are a category of particular interest: these are systems composed of a cyber part (computer-based algorithms) that monitors and controls physical processes. Their development and simulation are both complex due to the importance of the interaction between the cyber and the physical entities: there are many models, written in different languages, that need to exchange information with one another. Normally, an orchestrator takes care of simulating the models and exchanging the information. This orchestrator is developed manually, which is a tedious and time-consuming task. Our proposal is to generate the orchestrator automatically through Co-Modeling, i.e. by modeling the coordination. Before achieving this ultimate goal, it is important to understand the mechanisms and de facto standards that could be used in a co-modeling framework. I therefore studied a technology employed for co-simulation in industry: FMI. In order to better understand the FMI standard, I implemented an automatic export, in the FMI format, of the models created in an existing tool for discrete modeling: TimeSquare. I also developed a simple physical model in the existing open-source OpenModelica tool. Finally, I began to study how an orchestrator works by developing a simple one: this will be useful in the future for generating an orchestrator automatically.
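To make the orchestrator's role concrete, here is a minimal, hypothetical sketch of a fixed-step co-simulation master loop in Python. The Model interface (do_step with named inputs and outputs) and the controller/plant pair are illustrative assumptions only and do not use the actual FMI API; a real orchestrator would wrap FMUs and handle time synchronization, rollback and data conversion.

    from dataclasses import dataclass, field
    from typing import Callable, Dict

    @dataclass
    class Model:
        """Hypothetical simulation unit: consumes named inputs, advances by dt,
        exposes named outputs. A real FMU would sit behind the FMI API instead."""
        step_fn: Callable[[Dict[str, float], float, float], Dict[str, float]]
        outputs: Dict[str, float] = field(default_factory=dict)

        def do_step(self, inputs: Dict[str, float], t: float, dt: float) -> None:
            self.outputs = self.step_fn(inputs, t, dt)

    def orchestrate(models, connections, t_end, dt):
        """Fixed-step master: route outputs to inputs according to
        connections {(src_model, src_port): (dst_model, dst_port)},
        then advance every model by dt."""
        t = 0.0
        while t < t_end:
            inputs = {name: {} for name in models}
            for (src, port), (dst, dport) in connections.items():
                inputs[dst][dport] = models[src].outputs.get(port, 0.0)
            for name, model in models.items():
                model.do_step(inputs[name], t, dt)
            t += dt

    if __name__ == "__main__":
        # Cyber part: a bang-bang controller; physical part: a first-order heater.
        controller = Model(lambda u, t, dt: {"power": 1.0 if u.get("temp", 0.0) < 20.0 else 0.0})
        plant_state = {"temp": 15.0}

        def plant_step(u, t, dt):
            plant_state["temp"] += dt * (5.0 * u.get("power", 0.0) - 0.1 * plant_state["temp"])
            return {"temp": plant_state["temp"]}

        plant = Model(plant_step)
        models = {"ctrl": controller, "plant": plant}
        connections = {("plant", "temp"): ("ctrl", "temp"), ("ctrl", "power"): ("plant", "power")}
        orchestrate(models, connections, t_end=10.0, dt=0.01)
        print("final temperature:", round(plant.outputs["temp"], 2))

Generating this kind of loop automatically from a coordination model is precisely what the proposed Co-Modeling approach targets.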
Abstract:
Ventricular assist devices (VADs) and total artificial hearts have been in development for the last 50 years. Since their inception, simulators of the circulation with different degrees of complexity have been produced to test these devices in vitro. Currently, a new path has been taken with the extensive efforts to develop paediatric VADs, which require totally different design constraints. This paper presents the manufacturing details of an economical simulator of the systemic paediatric circulation. This simulator allows the insertion of a paediatric VAD, includes a pumping ventricle, and is adjustable within the paediatric range. Rather than focusing on complexity and physiological simulation, this simulator is designed to be simple and practical for rapid device testing. The simulator was instrumented with medical sensors and data were acquired under different conditions with and without the new PediaFlow™ paediatric VAD. The VAD was run at different impeller speeds while simulator settings such as vascular resistance and stroke volume were varied. The hydraulic performance of the VAD under pulsatile conditions could be characterized and the magnetic suspension could be tested via manipulations such as cannula clamping. This compact mock loop has proven to be valuable throughout the PediaFlow development process and has the advantage that it is uncomplicated and can be manufactured cheaply. It can be produced by several research groups and the results of different VADs can then be compared easily.
Abstract:
1. Proliferative kidney disease (PKD) is a disease of salmonid fish caused by the endoparasitic myxozoan, Tetracapsuloides bryosalmonae, which uses freshwater bryozoans as primary hosts. Clinical PKD is characterised by a temperature-dependent proliferative and inflammatory response to parasite stages in the kidney.
2. Evidence that PKD is an emerging disease includes outbreaks in new regions, declines in Swiss brown trout populations and the adoption of expensive practices by fish farms to reduce heavy losses. Disease-related mortality in wild fish populations is almost certainly underestimated because of, for example, oversight, scavenging by wild animals, misdiagnosis and fish stocking.
3. PKD prevalences are spatially and temporally variable, range from 0 to 90-100% and are typically highest in juvenile fish.
4. Laboratory and field studies demonstrate that (i) increasing temperatures enhance disease prevalence, severity and distribution and PKD-related mortality; (ii) eutrophication may promote outbreaks. Both bryozoans and T. bryosalmonae stages in bryozoans undergo temperature- and nutrient-driven proliferation.
5. Tetracapsuloides bryosalmonae is likely to achieve persistent infection of highly clonal bryozoan hosts through vertical transmission, low virulence and host-condition-dependent cycling between covert and overt infections. Exploitation of fish hosts entails massive proliferation and spore production by stages that escape the immune response. Many aspects of the parasite's life cycle remain obscure. If infectious stages are produced in all hosts then the complex life cycle includes multiple transmission routes.
6. Patterns of disease outbreaks suggest that background, subclinical infections exist under normal environmental conditions. When conditions change, outbreaks may then occur in regions where infection was hitherto unsuspected.
7. Environmental change is likely to cause PKD outbreaks in more northerly regions as warmer temperatures promote disease development, enhance bryozoan biomass and increase spore production, but may also reduce the geographical range of this unique multihost-parasite system. Coevolutionary dynamics resulting from host-parasite interactions that maximise fitness in previous environments may pose problems for sustainability, particularly in view of extensive declines in salmonid populations and degradation of many freshwater habitats.
Abstract:
Successful software systems cope with complexity by organizing classes into packages. However, a particular organization may be neither straightforward nor obvious for a given developer. As a consequence, classes can be misplaced, leading to duplicated code and ripple effects in which minor changes affect multiple packages. We claim that contextual information is the key to re-architecting a system. Exploiting contextual information, we propose a technique to detect misplaced classes by analyzing how client packages access the classes of a given provider package. We define locality as a measure of the degree to which classes reused by common clients appear in the same package. We then use locality to guide a simulated annealing algorithm to obtain optimal placements of classes in packages. The result is the identification of classes that are candidates for relocation. We apply the technique to three applications and validate the usefulness of our approach via developer interviews.
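The following Python sketch illustrates the general shape of such an optimization. The locality function here is a simplified stand-in for the paper's measure, and the tiny example data are invented for illustration: simulated annealing moves classes between packages and keeps placements in which classes used by the same clients tend to share a package.

    import math
    import random
    from collections import Counter

    def locality(placement, client_uses):
        """Simplified stand-in for the locality measure: for each client, the
        fraction of the classes it uses that sit in that client's most-used
        provider package. Higher means classes reused together live together."""
        scores = []
        for used in client_uses.values():
            packages = Counter(placement[c] for c in used)
            scores.append(max(packages.values()) / len(used))
        return sum(scores) / len(scores)

    def anneal(placement, packages, client_uses, steps=20000, t0=1.0, cooling=0.999):
        """Simulated annealing: randomly move one class to another package and
        accept worse placements with a probability that shrinks as the
        temperature drops."""
        current = dict(placement)
        best = dict(current)
        score = best_score = locality(current, client_uses)
        temp = t0
        classes = list(current)
        for _ in range(steps):
            cls = random.choice(classes)
            old_pkg = current[cls]
            current[cls] = random.choice(packages)
            new_score = locality(current, client_uses)
            if new_score >= score or random.random() < math.exp((new_score - score) / temp):
                score = new_score
                if score > best_score:
                    best, best_score = dict(current), score
            else:
                current[cls] = old_pkg          # reject the move
            temp *= cooling
        return best, best_score

    if __name__ == "__main__":
        placement = {"A": "p1", "B": "p2", "C": "p1", "D": "p2"}
        client_uses = {"client1": {"A", "B"}, "client2": {"A", "B"}, "client3": {"C", "D"}}
        best, score = anneal(placement, ["p1", "p2"], client_uses)
        print(best, round(score, 2))

Classes whose best-found package differs from their current one are the natural candidates for relocation.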
Abstract:
This paper focuses on adolescents who live in divided societies and how they navigate those divisions as they develop as civic actors. The study sites are Northern Ireland, South Africa, and the United States. In each setting we collected surveys, conducted focus groups with teachers and students, and followed students through the 9th and 10th grades in a case study classroom. In all locales, the students used materials from Facing History and Ourselves, and their teachers had participated in workshops on using those materials. In this paper we follow a case study student from the United States who provides a particularly complex look at issues of division and ethical civic development. The student, Pete, is a white immigrant from South Africa, studying in a multi-ethnic and multi-racial school in the United States. He confronts his South African legacies in the context of a foreign school system, which is working to help U.S. students confront their own legacies. Across two one-semester citizenship classes, Pete shows us the tension between an academic stance and a moral/emotional stance. When moral dilemmas become complex for him, he begins to lose his ability to judge. Teacher support and guidance are critical to help students like Pete learn to hold their moral ground while understanding why others act as they do.
Abstract:
Image-guided, computer-assisted neurosurgery has emerged to improve localization and targeting, to provide a better anatomical definition of the surgical field, and to decrease invasiveness. Usually, in image-guided surgery, a computer displays the surgical field in a CT/MR environment, using axial, coronal or sagittal views, or even a 3D representation of the patient. Such a system forces the surgeon to look away from the surgical scene to the computer screen. Moreover, this kind of information, being pre-operative imaging, cannot be modified during the operation, so it remains valid for guidance in the first stage of the surgical procedure, and mainly for rigid structures like bones. To address these two constraints, we are developing an ultrasound-guided surgical microscope. Such a system takes advantage of the fact that surgical microscopy and ultrasound systems are already used in neurosurgery, so it does not add more complexity to the surgical procedure. We have integrated an optical tracking device into the microscope, together with an augmented reality overlay system with which we avoid the need to look away from the scene, providing correctly aligned surgical images with sub-millimeter accuracy. In addition to the standard CT and 3D views, we are able to track an ultrasound probe and, using a prior calibration and registration of the imaging, the image obtained is correctly projected onto the overlay system, so that the surgeon can always localize the target and verify the effects of the intervention. Several tests of the system have already been performed to evaluate its accuracy, and clinical experiments are currently in progress to validate the clinical usefulness of the system.
Abstract:
Although assessment of asthma control is important to guide treatment, it is difficult since the temporal pattern and risk of exacerbations are often unpredictable. In this Review, we summarise the classic methods to assess control with unidimensional and multidimensional approaches. Next, we show how ideas from the science of complexity can explain the seemingly unpredictable nature of bronchial asthma and emphysema, with implications for chronic obstructive pulmonary disease. We show that fluctuation analysis, a method used in statistical physics, can be used to gain insight into asthma as a dynamic disease of the respiratory system, viewed as a set of interacting subsystems (eg, inflammatory, immunological, and mechanical). The basis of the fluctuation analysis methods is the quantification of the long-term temporal history of lung function parameters. We summarise how this analysis can be used to assess the risk of future asthma episodes, with implications for asthma severity and control both in children and adults.
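As an illustration of what a fluctuation analysis of lung-function records can look like, the sketch below implements detrended fluctuation analysis (DFA), a common variant of such methods from statistical physics. Whether DFA is the exact method discussed in this Review is an assumption here, and the synthetic peak-flow series is invented for the example.

    import numpy as np

    def dfa_exponent(series, min_box=4, max_box=None):
        """Detrended fluctuation analysis: integrate the mean-subtracted series,
        detrend it in boxes of increasing size, and return the scaling exponent
        alpha from the log-log slope of fluctuation vs box size. Values near 0.5
        indicate uncorrelated noise; larger values indicate long-range
        correlations in the record."""
        x = np.asarray(series, dtype=float)
        y = np.cumsum(x - x.mean())                 # integrated profile
        n = len(y)
        max_box = max_box or n // 4
        sizes = np.unique(np.logspace(np.log10(min_box), np.log10(max_box), 20).astype(int))
        flucts = []
        for s in sizes:
            n_boxes = n // s
            segments = y[: n_boxes * s].reshape(n_boxes, s)
            t = np.arange(s)
            rms = []
            for seg in segments:
                coeffs = np.polyfit(t, seg, 1)      # linear trend in this box
                rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
            flucts.append(np.mean(rms))
        return np.polyfit(np.log(sizes), np.log(flucts), 1)[0]

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        # Synthetic stand-in for a daily peak-flow record (white noise -> alpha near 0.5).
        series = 400 + 20 * rng.standard_normal(730)
        print("DFA exponent:", round(dfa_exponent(series), 2))

In the asthma setting, the scaling exponent computed from long peak-flow or FEV1 records is the kind of summary statistic that can be related to the risk of future episodes.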
Abstract:
The immune system faces a considerable challenge in its efforts to maintain tissue homeostasis in the intestinal mucosa. It is constantly confronted with a large array of antigens, and has to prevent the dissemination and proliferation of potentially harmful agents while sparing the vital structures of the intestine from immune-mediated destruction. Complex interactions between the highly adapted effector cells and mechanisms of the innate and adaptive immune system generally prevent the luminal microflora from penetrating the intestinal mucosa and from spreading systemically. Non-haematopoietic cells critically contribute to the maintenance of local tissue homeostasis in an antigen-rich environment by producing protective factors (e.g. production of mucus by goblet cells, or secretion of microbicidal defensins by Paneth cells) and also through interactions with the adaptive and innate immune system (such as the production of chemotactic factors that lead to the selective recruitment of immune cell subsets). The complexity of the regulatory mechanisms that control the local immune response to luminal antigens is also reflected in the observation that mutations in immunologically relevant genes often lead to the development of uncontrolled inflammatory reactions in the microbially colonized intestine of experimental animals.