977 results for Over-complete Discrete Wavelet Transformation
Abstract:
Performing a complete blood count is a daily routine necessary for good patient care. Nowadays, modern blood analyzers provide, on top of the classical blood values, several additional parameters. In this paper, using short case presentations, we discuss how to interpret these results and integrate them into the clinical context.
Abstract:
Aim: To describe changes in leisure time and occupational physical activity status in an urban Mediterranean population-based cohort, and to evaluate sociodemographic, health-related and lifestyle correlates of such changes. Methods: Data for this study come from the Cornellà Health Interview Survey Follow-Up Study, a prospective cohort study of a representative sample (n = 2500) of the population. Participants in the analysis reported here include 1246 subjects (567 men and 679 women) who had complete data on physical activity at the 1994 baseline survey and at the 2002 follow-up. We fitted Breslow-Cox regression models to assess the association between correlates of interest and changes in physical activity. Results: Regarding leisure time physical activity, 61.6% of cohort members with 'sedentary' habits in 1994 changed their status to 'light/moderate' physical activity in 2002, and 70% who had 'light/moderate' habits in 1994 did not change their activity level. Regarding occupational physical activity, 74.4% of cohort members who were 'active' did not change their level of activity, and 64.3% of participants with 'sedentary' habits in 1994 changed to 'active' occupational physical activity. No clear correlates of change in physical activity were identified in multivariate analyses. Conclusion: While changes in physical activity are evident in this population-based cohort, no clear determinants of such changes were recognised. Further longitudinal studies including other potential individual and contextual determinants are needed to better understand determinants of changes in physical activity at the population level.
Abstract:
We use the recently obtained theoretical expression for the complete QCD static energy at next-to-next-to-next-to-leading-logarithmic accuracy to determine $r_0 \Lambda_{\overline{\mathrm{MS}}}$ by comparison with available lattice data, where $r_0$ is the lattice scale and $\Lambda_{\overline{\mathrm{MS}}}$ is the QCD scale. We obtain $r_0 \Lambda_{\overline{\mathrm{MS}}} = 0.622^{+0.019}_{-0.015}$ for the zero-flavor case. The procedure we describe can be directly used to obtain $r_0 \Lambda_{\overline{\mathrm{MS}}}$ in the unquenched case, when unquenched lattice data for the static energy at short distances becomes available. Using the value of the strong coupling $\alpha_s$ as an input, the unquenched result would provide a determination of the lattice scale $r_0$.
Abstract:
The Complete Arabidopsis Transcriptome Micro Array (CATMA) database contains gene sequence tag (GST) and gene model sequences for over 70% of the predicted genes in the Arabidopsis thaliana genome as well as primer sequences for GST amplification and a wide range of supplementary information. All CATMA GST sequences are specific to the gene for which they were designed, and all gene models were predicted from a complete reannotation of the genome using uniform parameters. The database is searchable by sequence name, sequence homology or direct SQL query, and is available through the CATMA website at http://www.catma.org/.
Abstract:
Human T lymphocytes have a finite life span resulting from progressive telomere shortening that occurs at each cell division, eventually leading to chromosomal instability. It has been shown that ectopic expression of the human telomerase reverse transcriptase (hTERT) gene into various human cells results in the extension of their replicative life span, without inducing changes associated with transformation. However, it is still unclear whether cells that over-express telomerase are physiologically and biochemically indistinguishable from normal cells. To address this question, we compared the proteome of young and aged human CD8(+) T lymphocytes with that of T cells transduced with hTERT. Interestingly, we found no global changes in the protein pattern in young T cells, irrespective of telomerase expression. In contrast, several relevant proteins with differential expression patterns were observed in hTERT-transduced T cells with extended life span upon long-term culture. Altogether, our data revealed that T lymphocytes over-expressing telomerase displayed an intermediate protein pattern, sharing a similar protein expression not only with young T cells, but also with aged T cells. Finally, the results obtained from this global proteomic approach are in agreement with the overall gene transcription profiling performed on the same T-cell derived clones.
Abstract:
PURPOSE This prospective multicenter phase III study compared the efficacy and safety of a triple combination (bortezomib-thalidomide-dexamethasone [VTD]) versus a dual combination (thalidomide-dexamethasone [TD]) in patients with multiple myeloma (MM) progressing or relapsing after autologous stem-cell transplantation (ASCT). PATIENTS AND METHODS Overall, 269 patients were randomly assigned to receive bortezomib (1.3 mg/m(2) intravenous bolus) or no bortezomib for 1 year, in combination with thalidomide (200 mg per day orally) and dexamethasone (40 mg orally once a day on 4 days once every 3 weeks). Bortezomib was administered on days 1, 4, 8, and 11 with a 10-day rest period (day 12 to day 21) for eight cycles (6 months), and then on days 1, 8, 15, and 22 with a 20-day rest period (day 23 to day 42) for four cycles (6 months). Results Median time to progression (primary end point) was significantly longer with VTD than TD (19.5 v 13.8 months; hazard ratio, 0.59; 95% CI, 0.44 to 0.80; P = .001), the complete response plus near-complete response rate was higher (45% v 25%; P = .001), and the median duration of response was longer (17.2 v 13.4 months; P = .03). The 24-month survival rate was in favor of VTD (71% v 65%; P = .093). Grade 3 peripheral neuropathy was more frequent with VTD (29% v 12%; P = .001) as were the rates of grades 3 and 4 infection and thrombocytopenia. CONCLUSION VTD was more effective than TD in the treatment of patients with MM with progressive or relapsing disease post-ASCT but was associated with a higher incidence of grade 3 neurotoxicity.
Abstract:
An efficient high-resolution three-dimensional (3-D) seismic reflection system for small-scale targets in lacustrine settings was developed. In Lake Geneva, near the city of Lausanne, Switzerland, past high-resolution two-dimensional (2-D) investigations revealed a complex fault zone (the Paudèze thrust zone), which was subsequently chosen for testing our system. Observed structures include a thin (<40 m) layer of subhorizontal Quaternary sediments that unconformably overlie southeast-dipping Tertiary Molasse beds and the Paudèze thrust zone, which separates Plateau and Subalpine Molasse units. Two complete 3-D surveys have been conducted over this same test site, covering an area of about 1 km².
In 1999, a pilot survey (Survey I), comprising 80 profiles, was carried out in 8 days with a single-streamer configuration. In 2001, a second survey (Survey II) used a newly developed three-streamer system with optimized design parameters, which provided an exceptionally high-quality data set of 180 common midpoint (CMP) lines in 9 days. The main improvements include a navigation and shot-triggering system with in-house navigation software that automatically fires the gun in combination with real-time control on navigation quality using differential GPS (dGPS) onboard and a reference base near the lake shore. Shots were triggered at 5-m intervals with a maximum non-cumulative error of 25 cm. Whereas the single 48-channel streamer system of Survey I requires extrapolation of receiver positions from the boat position, for Survey II they could be accurately calculated (error <20 cm) with the aid of three additional dGPS antennas mounted on rafts attached to the end of each of the 24-channel streamers. Towed at a distance of 75 m behind the vessel, they allow the determination of feathering due to cross-line currents or small course variations. Furthermore, two retractable booms hold the three streamers at a distance of 7.5 m from each other, which is the same distance as the sail line interval for Survey I. With a receiver spacing of 2.5 m, the bin dimension of the 3-D data of Survey II is 1.25 m in in-line direction and 3.75 m in cross-line direction. The greater cross-line versus in-line spacing is justified by the known structural trend of the fault zone perpendicular to the in-line direction. The data from Survey I showed some reflection discontinuity as a result of insufficiently accurate navigation and positioning and subsequent binning errors. Observed aliasing in the 3-D migration was due to insufficient lateral sampling combined with the relatively high frequency (<2000 Hz) content of the water gun source (operated at 140 bars and 0.3 m depth). These results motivated the use of a double-chamber bubble-canceling air gun for Survey II. A 15/15 Mini G.I. air gun, operated at 80 bars and 1 m depth, proved to be better adapted for imaging the complexly faulted target area, which has reflectors dipping up to 30°. Although its frequencies do not exceed 650 Hz, this air gun combines a penetration of non-aliased signal to depths of 300 m below the water bottom (versus 145 m for the water gun) with a maximum vertical resolution of 1.1 m. While Survey I was shot in patches of alternating directions, the optimized surveying time of the new three-streamer system allowed acquisition in parallel geometry, which is preferable when using an asymmetric configuration (single source and receiver array). Otherwise, resulting stacks are different for the opposite directions. However, the shorter streamer configuration of Survey II reduced the nominal fold from 12 to 6. A 3-D conventional processing flow was adapted to the high sampling rates and was complemented by two computer programs that format the unconventional navigation data to industry standards. Processing included trace editing, geometry assignment, bin harmonization (to compensate for uneven fold due to boat/streamer drift), spherical divergence correction, bandpass filtering, velocity analysis, 3-D DMO correction, stack and 3-D time migration. A detailed semblance velocity analysis was performed on the 12-fold data set for every second in-line and every 50th CMP, i.e. on a total of 600 spectra.
According to this velocity analysis, interval velocities range from 1450-1650 m/s for the unconsolidated sediments and from 1650-3000 m/s for the consolidated sediments. Delineation of several horizons and fault surfaces reveals the potential for small-scale geologic and tectonic interpretation in three dimensions. Five major seismic facies and their detailed 3-D geometries can be distinguished in vertical and horizontal sections: lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), Plateau Molasse, Subalpine Molasse and its thrust fault zone. Dips of beds within Plateau and Subalpine Molasse are ~8° and ~20°, respectively. Within the fault zone, many highly deformed structures with dips around 30° are visible. Preliminary tests with 3-D preserved-amplitude prestack depth migration demonstrate that the excellent data quality of Survey II allows application of such sophisticated techniques even to high-resolution seismic surveys. In general, the adaptation of the 3-D marine seismic reflection method, which to date has almost exclusively been used by the oil exploration industry, to a smaller geographical as well as financial scale has helped pave the way for applying this technique to environmental and engineering purposes.

Seismic reflection is a subsurface investigation method with very high resolving power. It consists of sending vibrations into the ground and recording the waves that are reflected off geological discontinuities at different depths and then return to the surface, where they are recorded. The signals collected in this way not only provide information on the nature and geometry of the layers present, but also allow a geological interpretation of the subsurface. For example, in the case of sedimentary rocks, seismic reflection profiles make it possible to determine their mode of deposition, their possible deformations or fractures, and hence their tectonic history. Seismic reflection is the principal method of petroleum exploration. For a long time, seismic reflection data were acquired along individual profiles that provide a two-dimensional image of the subsurface. Such images are only partially accurate, since they do not account for the three-dimensional nature of geological structures. Over the past few decades, three-dimensional (3-D) seismics has brought new momentum to the study of the subsurface. Although it is now fully mastered for imaging large geological structures both onshore and offshore, its adaptation to lacustrine or fluvial scales has so far been the subject of only a few studies. This thesis work consisted of developing a seismic acquisition system similar to that used for offshore petroleum prospecting, but adapted to lakes. It is therefore smaller, lighter to deploy and, above all, delivers much higher resolution in the final images. Whereas the petroleum industry is often limited to a resolution on the order of ten metres, the instrument developed in this work resolves details on the order of one metre. The new system relies on recording seismic reflections simultaneously on three seismic cables (streamers) of 24 channels each. To obtain 3-D data, it is essential to position the instruments on the water (the source and the receivers of the seismic waves) with great precision. Software was specially developed to control navigation and trigger the shots of the seismic source, using differential GPS (dGPS) receivers on the boat and at the end of each streamer. This allows the instruments to be positioned with an accuracy of about 20 cm. To test our system, we chose an area of Lake Geneva, near the city of Lausanne, crossed by the "La Paudèze" fault, which separates the Plateau Molasse and Subalpine Molasse units. Two 3-D seismic surveys were carried out there over an area of about 1 km². The seismic recordings were then processed to turn them into interpretable images. We applied a 3-D processing sequence specially adapted to our data, in particular regarding positioning. After processing, the data reveal several main seismic facies corresponding notably to lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), the Plateau Molasse, the Subalpine Molasse of the fault zone, and the Subalpine Molasse south of this zone. The detailed 3-D geometry of the faults is visible on vertical and horizontal seismic sections. The excellent data quality and the interpretation of several horizons and fault surfaces demonstrate the potential of this technique for small-scale three-dimensional investigations, opening the way to its application in environmental and civil-engineering studies.
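The acquisition parameters quoted in this abstract fix the bin size and nominal fold by simple arithmetic. A minimal Python sketch reproducing those numbers (the function names are illustrative, not from the thesis):

```python
# CMP bin-size and nominal-fold arithmetic using only the acquisition
# parameters quoted in the abstract above (all lengths in metres).

def inline_bin(receiver_spacing: float) -> float:
    """In-line CMP bin length: half the receiver spacing."""
    return receiver_spacing / 2.0

def crossline_bin(streamer_separation: float) -> float:
    """Cross-line CMP bin width: half the streamer separation."""
    return streamer_separation / 2.0

def nominal_fold(n_channels: int, receiver_spacing: float, shot_interval: float) -> float:
    """Nominal CMP fold: (active spread length / 2) / shot interval."""
    return n_channels * receiver_spacing / (2.0 * shot_interval)

print(inline_bin(2.5), "x", crossline_bin(7.5), "m bins")  # 1.25 x 3.75 m (Survey II)
print(nominal_fold(48, 2.5, 5.0))  # 12-fold: Survey I, one 48-channel streamer
print(nominal_fold(24, 2.5, 5.0))  # 6-fold: Survey II, 24-channel streamers
```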
Abstract:
This work describes the formation of transformation products (TPs) by the laboratory-scale enzymatic degradation of two highly consumed antibiotics: tetracycline (Tc) and erythromycin (ERY). The analysis of the samples was carried out by a fast and simple method based on a novel configuration of an on-line turbulent flow system coupled to a hybrid linear ion trap – high-resolution mass spectrometer. The method was optimized and validated for the complete analysis of ERY, Tc and their transformation products within 10 min without any other sample manipulation. Furthermore, the applicability of the on-line procedure was evaluated for 25 additional antibiotics, covering a wide range of chemical classes in different environmental waters, with satisfactory quality parameters. Degradation rates obtained for Tc by the laccase enzyme and for ERY by the EreB esterase enzyme, without the presence of mediators, were ∼78% and ∼50%, respectively. Concerning the identification of TPs, three suspected compounds for Tc and five for ERY have been proposed. In the case of Tc, the tentative molecular formulas, with mass errors within 2 ppm, have been based on the hypothesis of dehydroxylation, (bi)demethylation and oxidation of rings A and C as major reactions. In contrast, the major TP detected for ERY has been identified as the "dehydration ERY-A", with the same molecular formula as its parent compound. In addition, the evaluation of the antibiotic activity of the samples along the enzymatic treatments showed a decrease of around 100% in both cases.
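The 2 ppm window quoted above is the usual criterion for accepting a tentative molecular formula from high-resolution mass data. A minimal sketch of that check (the m/z values below are hypothetical placeholders, not data from the study):

```python
def ppm_error(measured_mz: float, theoretical_mz: float) -> float:
    """Mass accuracy in parts per million."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Hypothetical measured/theoretical m/z pair for a candidate transformation product.
measured, theoretical = 445.1605, 445.1611
err = ppm_error(measured, theoretical)
print(f"{err:+.2f} ppm -> {'accept' if abs(err) <= 2.0 else 'reject'} formula candidate")
```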
Abstract:
This study describes the use of electroporation for transforming Xanthomonas axonopodis pv. citri (Xac), the causal agent of citrus (Citrus spp.) canker. It also evaluates the methodology used for this species under different electrical parameters. The bacterium used in the study (Xac 306) was the same strain used for the recent complete sequencing of the organism. The plasmid pUFR047 (gentamycin resistance) is reported here to be able to replicate in cells of Xac. Following the preparation and resuspension of competent cells of Xac at a density of ~4 x 10(10) cfu/ml in 10% glycerol, and the addition of the replicative plasmid, an electrical pulse was applied to each treatment. Selection of transformants showed a high efficiency of transformation (1.1 x 10(6) transformants/µg DNA), which indicates an effective, and inverse, combination of electrical resistance (50 Ω) and capacitance (50 µF) for this species, with an electrical field strength of 12.5 kV/cm and a 2.7-ms pulse duration. Besides describing a method for electroporation of Xac 306, this study provides additional information for the use of the technique in studies on the production of mutants of this species.
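For an exponential-decay electroporator, the reported resistance and capacitance settings imply the pulse time constant directly, and the field strength follows from the applied voltage and the cuvette gap. A small sketch of that arithmetic, assuming a standard 0.2 cm cuvette gap (an illustrative assumption; the other values are those reported above):

```python
# Exponential-decay electroporation pulse: time constant tau = R * C,
# field strength E = V / gap. Resistance (50 ohm), capacitance (50 uF) and
# field strength (12.5 kV/cm) are the values reported in the abstract;
# the 0.2 cm cuvette gap is an illustrative assumption.
R_ohm = 50.0
C_farad = 50e-6
gap_cm = 0.2                      # assumed cuvette gap
E_kv_per_cm = 12.5

tau_ms = R_ohm * C_farad * 1e3    # theoretical time constant in milliseconds
V_kv = E_kv_per_cm * gap_cm       # voltage needed to reach the reported field

print(f"tau = {tau_ms:.1f} ms (2.7 ms pulse reported), V = {V_kv:.1f} kV")
```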
Abstract:
Nowadays, biomass transformation has great potential for the synthesis of value-added compounds with a wide range of applications. Terpenoids, extracted from biomass, are inexpensive and renewable raw materials which often have biological activity and are widely used as important organic platform molecules in the development of new medicines as well as in the synthesis of fine chemicals and intermediates. At the same time, special attention is devoted to the application of gold catalysts to fine chemical synthesis due to their outstanding activity and/or selectivity for transformations of complex organic compounds. Conversion of renewable terpenoids in the presence of gold nanoparticles is one of the new and promising directions in the transformation of biomass to valuable chemicals. In the doctoral thesis, different kinds of natural terpenoids, such as α-pinene, myrtenol and carvone, were selected as starting materials. Gold catalysts were utilized in promising routes for the transformation of these compounds. Investigation of selective α-pinene isomerization to camphene, which is an important step in an industrial process towards the synthesis of camphor as well as other valuable substrates for the pharmaceutical industry, was performed. A high activity of heterogeneous gold catalysts in the Wagner-Meerwein rearrangement was demonstrated for the first time. Gold on an alumina carrier was found to reach α-pinene isomerization conversions of up to 99.9% with selectivities of 60-80%, thus making this catalyst very promising from an industrial viewpoint. A detailed investigation of kinetic regularities, including catalyst deactivation during the reaction, was performed. The one-pot terpene alcohol amination, which is a promising approach to the synthesis of valuable complex amines having specific physiological properties, was investigated. The general regularities of one-pot amination of natural myrtenol in the presence of gold catalysts, as well as a correlation between catalytic activity, catalyst redox treatment and the nature of the support, were established. Catalytic activity and product distribution were shown to be strongly dependent on the support properties, namely acidity and basicity. The gold-zirconia (Au/ZrO2) catalyst pretreated under an oxidizing atmosphere was observed to be rather active, resulting in total conversion of myrtenol and a selectivity to the corresponding amine of about 53%. The reaction kinetics was modelled based on mechanistic considerations, with the catalyst deactivation step incorporated in the mechanism. Carvone hydrogenation over a gold catalyst was studied with the general idea of investigating both the activity of gold catalysts in competitive hydrogenation of different functional groups and developing an approach to the synthesis of valuable carvone derivatives. Gold was found to promote stereo- and chemoselective carvone hydrogenation to dihydrocarvone with a predominant formation of the trans-isomer, which constitutes a novel synthetic method for industrially valuable dihydrocarvone. The solvent effect on the catalytic activity as well as on the ratio between trans- and cis-dihydrocarvone was evaluated.
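The abstract does not give the rate equations of the kinetic model; purely as a generic illustration of the kind of model it describes (first-order substrate consumption with a first-order catalyst-deactivation step, all rate constants hypothetical), a sketch might look like:

```python
from scipy.integrate import solve_ivp

# Generic first-order kinetics with first-order catalyst decay:
#   dC/dt = -k * a * C   (substrate consumption)
#   da/dt = -k_d * a     (catalyst activity decay)
# The rate constants are hypothetical and only illustrate the model structure.
k, k_d = 0.02, 0.005          # 1/min, illustrative values

def rhs(t, y):
    C, a = y
    return [-k * a * C, -k_d * a]

sol = solve_ivp(rhs, (0.0, 360.0), [1.0, 1.0], t_eval=[360.0])
print(f"conversion after 6 h: {1.0 - sol.y[0, -1]:.2f}")
```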
Abstract:
A web service is a software system that provides a machine-processable interface to the other machines over the network using different Internet protocols. They are being increasingly used in the industry in order to automate different tasks and offer services to a wider audience. The REST architectural style aims at producing scalable and extensible web services using technologies that play well with the existing tools and infrastructure of the web. It provides a uniform set of operation that can be used to invoke a CRUD interface (create, retrieve, update and delete) of a web service. The stateless behavior of the service interface requires that every request to a resource is independent of the previous ones facilitating scalability. Automated systems, e.g., hotel reservation systems, provide advanced scenarios for stateful services that require a certain sequence of requests that must be followed in order to fulfill the service goals. Designing and developing such services for advanced scenarios with REST constraints require rigorous approaches that are capable of creating web services that can be trusted for their behavior. Systems that can be trusted for their behavior can be termed as dependable systems. This thesis presents an integrated design, analysis and validation approach that facilitates the service developer to create dependable and stateful REST web services. The main contribution of this thesis is that we provide a novel model-driven methodology to design behavioral REST web service interfaces and their compositions. The behavioral interfaces provide information on what methods can be invoked on a service and the pre- and post-conditions of these methods. The methodology uses Unified Modeling Language (UML), as the modeling language, which has a wide user base and has mature tools that are continuously evolving. We have used UML class diagram and UML state machine diagram with additional design constraints to provide resource and behavioral models, respectively, for designing REST web service interfaces. These service design models serve as a specification document and the information presented in them have manifold applications. The service design models also contain information about the time and domain requirements of the service that can help in requirement traceability which is an important part of our approach. Requirement traceability helps in capturing faults in the design models and other elements of software development environment by tracing back and forth the unfulfilled requirements of the service. The information about service actors is also included in the design models which is required for authenticating the service requests by authorized actors since not all types of users have access to all the resources. In addition, following our design approach, the service developer can ensure that the designed web service interfaces will be REST compliant. The second contribution of this thesis is consistency analysis of the behavioral REST interfaces. To overcome the inconsistency problem and design errors in our service models, we have used semantic technologies. The REST interfaces are represented in web ontology language, OWL2, that can be part of the semantic web. These interfaces are used with OWL 2 reasoners to check unsatisfiable concepts which result in implementations that fail. This work is fully automated thanks to the implemented translation tool and the existing OWL 2 reasoners. 
The third contribution of this thesis is the verification and validation of REST web services. We have used model-checking techniques with the UPPAAL model checker for this purpose. Timed automata are generated from the UML-based service design models with our transformation tool and are verified for basic characteristics such as deadlock freedom, liveness, reachability and safety. The implementation of a web service is tested using a black-box testing approach. Test cases are generated from the UPPAAL timed automata and, using the online testing tool UPPAAL TRON, the service implementation is validated at runtime against its specifications. Requirement traceability is also addressed in our validation approach, with which we can see which service goals are met and trace back the unfulfilled service goals to detect the faults in the design models. A final contribution of the thesis is an implementation of behavioral REST interfaces and service monitors from the service design models. The partial code generation tool creates code skeletons of REST web services with method pre- and post-conditions. The preconditions of methods constrain the user to invoke the stateful REST service under the right conditions, and the postconditions constrain the service developer to implement the right functionality. The details of the methods can be manually inserted by the developer as required. We do not target complete automation because we focus only on the interface aspects of the web service. The applicability of the approach is demonstrated with a pedagogical example of a hotel room booking service and a relatively complex worked example of a holiday booking service taken from an industrial context. The former example presents a simple explanation of the approach and the latter worked example shows how stateful and timed web services offering complex scenarios and involving other web services can be constructed using our approach.
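As a rough illustration of the kind of method skeleton with pre- and post-condition checks described above (the hotel-booking resource, state names and method signature are invented for illustration, not output of the thesis's code-generation tool):

```python
# Hypothetical skeleton of a stateful REST resource method with pre- and
# post-condition checks, in the spirit of the generated code described above.
from dataclasses import dataclass

@dataclass
class Booking:
    state: str = "created"   # created -> confirmed -> checked_in

def confirm(booking: Booking, payment_ok: bool) -> Booking:
    # Precondition: the booking must be in the right state and payment accepted.
    assert booking.state == "created", "precondition violated: wrong state"
    assert payment_ok, "precondition violated: payment not accepted"

    booking.state = "confirmed"   # body to be filled in by the developer

    # Postcondition: the service must leave the resource in the advertised state.
    assert booking.state == "confirmed", "postcondition violated"
    return booking

print(confirm(Booking(), payment_ok=True).state)   # confirmed
```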
Abstract:
Objective of the study The aim of this study is to understand the institutional implications in Abenomics in a spatial context, the contemporary economic reform taking place in Japan, which is to finally end over two decades of economic malaise. For theoretical perspective of choice, this study explores a synthesis of institutionalism as the main approach, complemented by economies of agglomeration in spatial economics, or New Economic Geography (NEG). The outcomes include a narrative with implications for future research, as well as possible future implications for the economy of Japan, itself. The narrative seeks to depict the dialogue between public discourse and governmental communication in order to create a picture of how this phenomenon is being socially constructed. This is done by studying the official communications by the Cabinet along with public media commentary on respective topics. The reform is studied with reference to historical socio-cultural, economic evolution of Japan, which in turn, is explored through a literature review. This is to assess the unique institutional characteristics of Japan pertinent to reform. Research method This is a social and exploratory qualitative study – an institutional narrative case study. The methodological approach was kept practical: in addition to literature review, a narrative, thematic content analysis with structural emphasis was used to construct the contemporary narrative based on the Cabinet communication. This was combined with practical analytic tools borrowed from critical discourse analysis, which were utilized to assess the implicit intertextual agenda within sources. Findings What appears to characterize the discourse is status quo bias that comes in multiple forms. The bias is also coded in the institutions surrounding the reform, wherein stakeholders have vested interests in protecting the current state of affairs. This correlates with uncertainty avoidance characteristic to Japan. Japan heeds the international criticism to deregulate on a rhetorical level, but consistent with history, the Cabinet solutions appear increasingly bureaucratic. Hence, the imposed western information-age paradigm of liberal cluster agglomeration seems ill-suited to Japan which lacks risk takers and a felicitous entrepreneur culture. The Japanese, however, possess vast innovative potential ascribed to some institutional practices and traits, but restrained by others. The derived conclusion is to study the successful intrapreneur cases in Japanese institutional setting as a potential benchmark for Japan specific cluster agglomeration, and a solution to its structural problems impeding growth.
Abstract:
Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
How is the corneal epithelium restored when all of it plus the limbus have been eliminated? This investigation explored the possibility that this may be achieved through the conjunctival epithelium. The corneal epithelium of the right eye of 12 rabbits (Oryctolagus cuniculus) was totally scraped followed by surgical excision of the limbus plus 1.0-1.5 mm of the adjacent conjunctiva. Antibiotics and corticosteroids were applied for 1 week after surgery. Histological and immunohistochemical techniques were used to monitor the events taking place on the eye surface 2 weeks and 1, 3 and 6 months thereafter. Initially, the corneal surface was covered by conjunctival-like epithelium. After 1 month and more prominently at 3 and 6 months an epithelium displaying the morphological features of the cornea and reacting with the AE5 antibody was covering the central region. It is likely that the corneal epithelium originated from undifferentiated cells of the conjunctiva interacting with the corneal stroma.
Abstract:
Lipids were extracted from Chlorella algae with supercritical hexane. A high lipid yield of approximately 10% was obtained at the optimum conditions of 300 rpm stirring speed and 2 h duration, compared to the total lipid content of 12%. Furthermore, the ease of hexane recovery may be considered economically and ecologically attractive. For the first time, in the current work catalytic hydrodeoxygenation (HDO) of Chlorella algal lipids was studied over 5 wt% Ni/H-Y-80 and 5 wt% Ni/SiO2 at 300 °C and under 30 bar total pressure in H2. A comparative HDO of stearic acid was carried out under similar conditions. The conversion of lipids was about 35% over 5 wt% Ni/H-Y-80 after 6 h, whereas 5 wt% Ni/SiO2 was totally deactivated after 60 min. The selectivity to hydrocarbons (C15-C18) was 6%. As a comparison, complete conversion of stearic acid over 5 wt% Ni/H-Y-80 was achieved in 6 h. The transformation of lipids proceeded mostly via hydrogenation and hydrolysis with formation of free fatty acids (FFA). The lower activity might be attributed to deactivation of the catalysts caused by chlorophylls and carotenoids. Even though the conversion is low, future studies on HDO of lipids extracted from other algae species with higher lipid content could be proposed. Coke-resistant catalysts might be considered to improve catalytic activity.