Abstract:
A new triterpene, 1-epi-castanopsol, together with eleven known compounds: sitosterol, stigmasterol, campesterol, lupeol, lupenone, simirane B, syringaresinol, scopoletin, isofraxidin, 6,7,8-trimethoxycoumarin and harman, was isolated from the wood of Simira glaziovii. The structures of the known compounds were established by analysis of 1D and 2D ¹H and ¹³C NMR spectral data and by comparison with literature data. Detailed analysis of the spectral data allowed the definition of the structure of the new 1-epi isomer of castanopsol and the assignment of its ¹H and ¹³C NMR chemical shifts.
Abstract:
The application of automated correlation optimized warping (ACOW) to the correction of retention time shift in the chromatographic fingerprints of Radix Puerariae thomsonii (RPT) was investigated. Twenty-seven samples were extracted from 9 batches of RPT products. The fingerprints of the 27 samples were established by the HPLC method. Because there is a retention time shift in the established fingerprints, the quality of these samples cannot be correctly evaluated by using similarity estimation and principal component analysis (PCA). Thus, the ACOW method was used to align these fingerprints. In the ACOW procedure, the warping parameters, which have a significant influence on the alignment result, were optimized by an automated algorithm. After correcting the retention time shift, the quality of these RPT samples was correctly evaluated by similarity estimation and PCA. It is demonstrated that ACOW is a practical method for aligning the chromatographic fingerprints of RPT. The combination of ACOW, similarity estimation, and PCA is shown to be a promising method for evaluating the quality of Traditional Chinese Medicine.
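The alignment-then-PCA pipeline described above can be sketched in a simplified form. The snippet below is illustrative only: it corrects a rigid retention-time shift by maximizing cross-correlation against a reference fingerprint (ACOW proper warps piecewise segments with automatically optimized warping parameters, which this sketch does not implement) and then projects the aligned fingerprints onto their leading principal components. All function names and parameters are assumptions, not taken from the paper.

```python
import numpy as np

def align_by_shift(reference, signal, max_shift=50):
    """Undo a rigid retention-time shift by maximizing the
    cross-correlation with a reference fingerprint. (ACOW proper
    warps piecewise segments via dynamic programming; this rigid
    version only illustrates correlation-optimized alignment.)"""
    best_shift, best_corr = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        corr = float(np.dot(reference, np.roll(signal, s)))
        if corr > best_corr:
            best_corr, best_shift = corr, s
    return np.roll(signal, best_shift)

def pca_scores(fingerprints, n_components=2):
    """Scores of mean-centred fingerprints on the leading principal
    components (rows = samples, columns = time points)."""
    X = fingerprints - fingerprints.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[:n_components].T
```

After alignment, samples from the same batch cluster together in the score plot, which is what makes similarity estimation and PCA meaningful again.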
Abstract:
Systems biology is a new, emerging and rapidly developing multidisciplinary research field that aims to study biochemical and biological systems from a holistic perspective, with the goal of providing a comprehensive, system-level understanding of cellular behaviour. In this way, it addresses one of the greatest challenges faced by contemporary biology, which is to comprehend the function of complex biological systems. Systems biology combines various methods that originate from scientific disciplines such as molecular biology, chemistry, engineering sciences, mathematics, computer science and systems theory. Systems biology, unlike “traditional” biology, focuses on high-level concepts such as: network, component, robustness, efficiency, control, regulation, hierarchical design, synchronization, concurrency, and many others. The very terminology of systems biology is “foreign” to “traditional” biology; it marks a drastic shift in the research paradigm and indicates the close linkage of systems biology to computer science. One of the basic tools utilized in systems biology is the mathematical modelling of life processes, tightly linked to experimental practice. The studies contained in this thesis revolve around a number of challenges commonly encountered in computational modelling in systems biology. The research comprises the development and application of a broad range of methods originating in the fields of computer science and mathematics for the construction and analysis of computational models in systems biology. In particular, the performed research is set up in the context of two biological phenomena chosen as modelling case studies: 1) the eukaryotic heat shock response and 2) the in vitro self-assembly of intermediate filaments, one of the main constituents of the cytoskeleton.
The range of presented approaches spans from heuristic, through numerical and statistical, to analytical methods applied in the effort to formally describe and analyse the two biological processes. We note, however, that although applied to certain case studies, the presented methods are not limited to them and can be utilized in the analysis of other biological mechanisms as well as complex systems in general. The full range of developed and applied modelling techniques, as well as model analysis methodologies, constitutes a rich modelling framework. Moreover, the presentation of the developed methods, their application to the two case studies and the discussions concerning their potentials and limitations point to the difficulties and challenges one encounters in the computational modelling of biological systems. The problems of model identifiability, model comparison, model refinement, model integration and extension, the choice of the proper modelling framework and level of abstraction, and the choice of the proper scope of the model run through this thesis.
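As a concrete, if drastically simplified, illustration of the kind of ODE-based modelling tightly linked to experiment that such theses build on, consider a hypothetical two-variable stress-response model: misfolded protein P accumulates under stress and is refolded by a chaperone C whose synthesis is induced by P. All variables and rate constants below are invented for illustration and are not taken from the thesis models.

```python
def simulate(k_stress=1.0, k_fold=0.5, k_syn=0.4, k_deg=0.1,
             dt=0.01, steps=20000):
    """Forward-Euler integration of a toy stress-response model:
        dP/dt = k_stress - k_fold * C * P   (misfolded protein P)
        dC/dt = k_syn * P  - k_deg * C      (chaperone C)
    Returns the (P, C) trajectory as a list of tuples."""
    P, C = 0.0, 0.1
    trace = []
    for _ in range(steps):
        dP = k_stress - k_fold * C * P
        dC = k_syn * P - k_deg * C
        P, C = P + dt * dP, C + dt * dC
        trace.append((P, C))
    return trace

# At steady state k_stress = k_fold * C * P, i.e. C * P = 2 here:
# the chaperone level adapts until folding balances the stress input.
```

Even a toy model like this raises the issues the thesis discusses: whether the rate constants are identifiable from data, and at what level of abstraction the biology should be described.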
Abstract:
The objective of this study is to explore how the Open Innovation paradigm is applied by small and medium-sized enterprises in Russia. The focus of the study is to understand how the processes of research and development and commercialization proceed in these kinds of companies and to what extent they apply open innovation principles. The Russian leadership is taking certain steps towards a transition from the export of raw materials to an innovative model of economic growth, and the research aims to disclose the actual impact of these attempts. The closed innovation model and the erosion factors which lead to the destruction of the old model and the emergence of a new one are described. Features of open innovation implementation and intellectual property rights protection in small and medium enterprises are presented. To achieve the objective, a qualitative case study approach was chosen. The research includes facts and figures, as well as the views and opinions of the management of the studied companies related to the innovation process in the company and in Russia in general. The research depicts the features of Open Innovation implementation by SMEs in Russia. A large number of research centers with the necessary equipment and qualified personnel allow the case companies to use external R&D effectively. They cooperate actively with research institutes, universities and laboratories. Thus, they apply inbound Open Innovation. On the contrary, the lack of venture capital, low demand for technologies within the domestic market and weak protection of intellectual property limit the external paths to new markets. Licensing-out and the creation of spin-offs are isolated cases. Therefore, outbound Open Innovation is not a regular practice.
Abstract:
The term "complicated" diverticulitis is reserved for inflamed diverticular disease complicated by bleeding, abscess, peritonitis, fistula or bowel obstruction. Hemorrhage is best treated by angioembolization (interventional radiology). Treatment of infected diverticulitis has evolved enormously thanks to: 1) laparoscopic colonic resection, followed or not (Hartmann's procedure) by restoration of intestinal continuity, and 2) simple laparoscopic lavage (for peritonitis, with or without resection). Diverticulitis (inflammation) may be treated with antibiotics alone or anti-inflammatory drugs, combined with bed rest and hygienic measures. Diverticular abscesses (Hinchey Grades I, II) may be initially treated by antibiotics alone and/or percutaneous drainage, depending on the size of the abscess. Generalized purulent peritonitis (Hinchey III) may be treated by the classic Hartmann procedure, by exteriorization of the perforation as a stoma, by primary resection with or without anastomosis and with or without diversion, or, lastly, by simple laparoscopic lavage, usually even without drainage. Feculent peritonitis (Hinchey IV), a traditional indication for Hartmann's procedure, may also benefit from primary resection followed by anastomosis, with or without diversion, and even laparoscopic lavage. Acute obstruction (from nearby inflammation, adhesions, pseudotumoral formation or chronic strictures) and fistula are most often treated by resection, ideally laparoscopic. Minimally invasive therapeutic algorithms, combined with less strict indications for radical surgery before a definite recurrence pattern is established, have definitely led to fewer resections and/or stomas, reducing their attendant morbidity and mortality, improving post-interventional quality of life, and lowering the cost of therapeutic policies.
Abstract:
This doctoral dissertation investigates the adult education policy of the European Union (EU) in the framework of the Lisbon agenda 2000–2010, with a particular focus on the changes of policy orientation that occurred during this reference decade. The year 2006 can be considered, in fact, a turning point for EU policy-making in the adult learning sector: a radical shift from a wide-ranging and comprehensive conception of educating adults towards a vocationally oriented understanding of this field and policy area has been observed, in particular in the second half of the so-called ‘Lisbon decade’. In this light, one of the principal objectives of the mainstream policy set by the Lisbon Strategy, that of fostering all forms of participation of adults in lifelong learning paths, appears to have changed its political background and vision in a very short period of time, reflecting an underlying polarisation and progressive transformation of European policy orientations. Hence, by means of content analysis and process tracing, it is shown that the target of the EU's adult education policy has shifted from citizens to workers, and that the competence development model, borrowed from the corporate sector, has been established as the reference for the new policy road maps. This study draws on the theory of governance architectures and applies a post-ontological perspective to discuss whether the above trends are intrinsically due to the nature of the Lisbon Strategy, which encompasses education policies, and to what extent supranational actors and phenomena such as globalisation influence European governance and decision-making.
Moreover, it is shown that the way in which the EU is shaping the upgrading of the skills and competences of adult learners is modelled around the needs of the ‘knowledge economy’, thus according a great deal of importance to the ‘new skills for new jobs’ and perhaps not enough to life skills in their broader sense, which include, for example, social and civic competences: these are often promoted but rarely implemented in depth in the EU policy documents. In this framework, it is conveyed how different EU policy areas are intertwined and interrelated with global phenomena, and it is emphasised how far the building of EU education systems should play a crucial role in the formation of critical thinking, civic competences and skills for a sustainable democratic citizenship, on which a truly cohesive and inclusive society fundamentally depends; a model of environmental and cosmopolitan adult education is proposed in order to address the challenges of the new millennium. In conclusion, an appraisal of the EU's public policy, along with some personal thoughts on how progress might be pursued and actualised, is outlined.
Abstract:
With the shift towards many-core computer architectures, dataflow programming has been proposed as one potential solution for producing software that scales to a varying number of processor cores. Programming for parallel architectures is considered difficult, as the current popular programming languages are inherently sequential and introducing parallelism is typically left to the programmer. Dataflow, however, is inherently parallel, describing an application as a directed graph, where nodes represent calculations and edges represent data dependencies in the form of queues. These queues are the only allowed communication between the nodes, making the dependencies between the nodes explicit and thereby also the parallelism. Once a node has sufficient inputs available, it can, independently of any other node, perform calculations, consume inputs, and produce outputs. Dataflow models have existed for several decades and have become popular for describing signal processing applications, as the graph representation is a very natural representation within this field: digital filters are typically described with boxes and arrows also in textbooks. Dataflow is also becoming more interesting in other domains, and in principle any application working on an information stream fits the dataflow paradigm. Such applications include, among others, network protocols, cryptography, and multimedia applications. As an example, the MPEG group standardized a dataflow language called RVC-CAL to be used within reconfigurable video coding. Describing a video coder as a dataflow network instead of with conventional programming languages makes the coder more readable, as it describes how the video data flows through the different coding tools. While dataflow provides an intuitive representation for many applications, it also introduces some new problems that need to be solved in order for dataflow to be more widely used.
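The node-and-queue model just described can be sketched in a few lines. This is a minimal, hypothetical illustration of the dataflow firing rule (an actor fires only when every input queue holds a token), not RVC-CAL or any real dataflow runtime:

```python
from collections import deque

class Node:
    """A dataflow actor: fires when every input queue holds a token,
    consuming one token per input and pushing the result downstream."""
    def __init__(self, func, n_inputs):
        self.func = func
        self.inputs = [deque() for _ in range(n_inputs)]
        self.outputs = []          # downstream queues to push results to

    def can_fire(self):
        return all(q for q in self.inputs)

    def fire(self):
        args = [q.popleft() for q in self.inputs]
        result = self.func(*args)
        for q in self.outputs:
            q.append(result)
        return result

# Wire a tiny graph: two token streams feed an adder node.
add = Node(lambda a, b: a + b, n_inputs=2)
add.inputs[0].extend([1, 2, 3])
add.inputs[1].extend([10, 20, 30])

results = []
while add.can_fire():
    results.append(add.fire())
# results == [11, 22, 33]
```

Because the queues carry all communication, the runtime can fire any ready node on any core; nothing in the node itself depends on execution order.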
The explicit parallelism of a dataflow program is descriptive and enables an improved utilization of the available processing units; however, the independence of the nodes also implies that some kind of scheduling is required. The need for efficient scheduling becomes even more evident when the number of nodes is larger than the number of processing units and several nodes are running concurrently on one processor core. There exist several dataflow models of computation, with different trade-offs between expressiveness and analyzability. These vary from rather restricted but statically schedulable, with minimal scheduling overhead, to dynamic, where each firing requires a firing rule to be evaluated. The model used in this work, namely RVC-CAL, is a very expressive language, and in the general case it requires dynamic scheduling; however, the strong encapsulation of dataflow nodes enables analysis, and the scheduling overhead can be reduced by using quasi-static, or piecewise static, scheduling techniques. The scheduling problem is concerned with finding the few scheduling decisions that must be made at run-time, while most decisions are pre-calculated. The result is then an, as small as possible, set of static schedules that are dynamically scheduled. To identify these dynamic decisions and to find the concrete schedules, this thesis shows how quasi-static scheduling can be represented as a model checking problem. This involves identifying the relevant information needed to generate a minimal but complete model to be used for model checking. The model must describe everything that may affect the scheduling of the application while omitting everything else in order to avoid state space explosion. This kind of simplification is necessary to make the state space analysis feasible. For the model checker to find the actual schedules, a set of scheduling strategies is defined which is able to produce quasi-static schedulers for a wide range of applications.
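The idea of searching a bounded state space for static schedules can be illustrated on a toy synchronous-dataflow edge: actor A produces 2 tokens per firing, actor B consumes 3, and the buffer holds at most 6 tokens. The rates, the capacity and the breadth-first search below are invented for illustration; the thesis applies a model checker to far richer RVC-CAL actor state.

```python
from collections import deque

# Toy rates for a single edge A -> B (illustrative values only).
PRODUCES = {"A": 2}   # tokens A pushes per firing
CONSUMES = {"B": 3}   # tokens B pops per firing
CAPACITY = 6          # bounded buffer keeps the state space finite

def find_periodic_schedule(start=0):
    """Breadth-first search over buffer fill levels for a non-empty
    firing sequence that returns the buffer to its initial state --
    a static schedule that can then be repeated with no run-time
    scheduling decisions."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        tokens, schedule = queue.popleft()
        for actor in ("A", "B"):
            if actor == "A" and tokens + PRODUCES["A"] <= CAPACITY:
                nxt = tokens + PRODUCES["A"]
            elif actor == "B" and tokens >= CONSUMES["B"]:
                nxt = tokens - CONSUMES["B"]
            else:
                continue            # firing rule not satisfied
            if nxt == start:
                return schedule + [actor]
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, schedule + [actor]))
    return None
```

The search returns a schedule firing A three times and B twice, the balanced repetition vector for these rates; in the quasi-static setting, such pre-computed sequences are what remain after the few genuinely dynamic decisions are factored out.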
The results of this work show that actor composition with quasi-static scheduling can be used to transform dataflow programs to fit many different computer architectures with different types and numbers of cores. This, in turn, enables dataflow to provide a more platform-independent representation, as one application can be fitted to a specific processor architecture without changing the actual program representation. Instead, the program representation is optimized by the development tools, in the context of design space exploration, to fit the target platform. This work focuses on representing the dataflow scheduling problem as a model checking problem and is implemented as part of a compiler infrastructure. The thesis also presents experimental results as evidence of the usefulness of the approach.
Abstract:
Rio+20, or the United Nations Conference for Sustainable Development, will take place at the end of this month of June 2012. In this paper, our central argument is that Brazil, as the host of Rio+20, has a historic opportunity to make the conference a success and take a decisive step in becoming a world leader in the shift from the traditional development paradigm to a new, sustainable development paradigm. To do that, Brazil will have to resolve a paradox: on the one hand the country has modern legislation and world-class science, and on the other hand very poor social and environmental decision-making in recent times. In this column, we examine the green economy as a trajectory that leads to sustainable development and describe some pilot experiences at the sub-national level in Brazil. We discuss how science, and particularly plant sciences, will be essential to the transition to sustainable development. Finally, we propose immediate actions that we call upon the Brazilian government to commit to and to announce during this pivotal Rio+20 moment, which should serve as a milestone for all nations in building a sustainable future.
Abstract:
Opiates have been implicated in learned helplessness (LH), a phenomenon known to be related to opiate stress-induced analgesia (SIA). In the present study, we investigated the role of opiates in the induction of LH and SIA under different conditions. Adult female Wistar rats were trained either by receiving 60 inescapable 1-mA footshocks (IS group, N = 114) or by confinement in the shock box (control or NS group, N = 92). The pain threshold of some of the animals was immediately evaluated in a tail-flick test, while the rest were used 24 h later in a shuttle box experiment to examine their escape performance. The opiate antagonist naltrexone (0 or 8 mg/kg, ip) and the previous induction of cross-tolerance to morphine by the chronic administration of morphine (0 or 10 mg/kg, sc, for 13 days) were used to identify opiate involvement. Analysis of variance revealed that only animals in the IS group demonstrated antinociception and an escape deficit, both of which were resistant to the procedures applied before the training session. However, the escape deficit could be reversed if the treatments were given before the test session. We conclude that, under our conditions, induction of the LH deficit in escape performance is not opiate-mediated, although its expression is opiate-modulated.
Abstract:
The nucleolus is the cellular site of ribosome biosynthesis. At this site, active ribosomal DNA (rDNA) genes are rapidly transcribed by RNA polymerase I (pol I) molecules. Recent advances in our understanding of the pol I transcription system have indicated that regulation of ribosomal RNA (rRNA) synthesis is a critical factor in cell growth. Importantly, the same signaling networks that control cell growth and proliferation and are deregulated in cancer appear to control pol I transcription. Therefore, the study of the biochemical basis for growth regulation of pol I transcription can provide basic information about the nuclear signaling network. Hopefully, this information may facilitate the search for drugs that can inhibit the growth of tumor cells by blocking pol I activation. In addition to its function in ribosome biogenesis, recent studies have revealed the prominent role of the nucleolus in cell senescence. These findings have stimulated a new wave of research on the functional relationship between the nucleolus and aging. The aim of this review is to provide an overview of some current topics in the area of nucleolus biology, and it has been written for a general readership.
Abstract:
The manufacturing industry has always faced challenges in improving production efficiency, product quality and innovation ability, and has struggled to adopt cost-effective manufacturing systems. In recent years, cloud computing has emerged as one of the major enablers for the manufacturing industry. By combining cloud computing and other advanced manufacturing technologies, such as the Internet of Things, service-oriented architecture (SOA), networked manufacturing (NM) and manufacturing grid (MGrid), with existing manufacturing models and enterprise information technologies, a new paradigm called cloud manufacturing has been proposed in the recent literature. This study presents the concepts and ideas of cloud computing and cloud manufacturing. The concept, architecture, core enabling technologies, and typical characteristics of cloud manufacturing are discussed, as well as the difference and relationship between cloud computing and cloud manufacturing. The research is based on mixed qualitative and quantitative methods, and a case study. The case is a prototype cloud manufacturing solution: a software platform developed in cooperation between ATR Soft Oy and the SW Company China office. This study tries to understand the practical impacts and challenges that derive from cloud manufacturing. The main conclusion of this study is that cloud manufacturing is an approach to achieving the transformation from traditional production-oriented manufacturing to next-generation service-oriented manufacturing. Many manufacturing enterprises are already using a form of cloud computing in their existing network infrastructure to increase the flexibility of their supply chains and reduce resource consumption; the study finds that the shift from cloud computing to cloud manufacturing is feasible. Meanwhile, the study points out that the related theory, methodology and application of cloud manufacturing systems are far from mature; it is still an open field where many new technologies need to be studied.