332 results for "toolbox"


Relevance: 10.00%

Abstract:

The State, in its economic, political and social spheres, plays a decisive role in the shaping of the sciences. Science, which sought to explain natural phenomena, developed and branched into diverse areas and fields in an attempt to answer the complex questions of the modern world. Health is one of these complex fields: it initially comprised the history of diseases and of living conditions, and had to be re-examined once that theory alone could no longer account for the complexity of human existence and ways of living. Collective Health in Brazil, in particular, reinvented ways of responding to the numerous and complex questions posed in the sphere of life and living conditions. This study therefore set out to explore the historical, political and conceptual trajectory of the constitution of the field of Collective Health in Brazil, supported by a methodology that draws its analytical elements from the very reflection the study develops, in an investigative movement termed "entre-meios" (in-between). The participants' accounts are presented, addressing episodes and reflections on the events that marked the history of Collective Health in Brazil, which provided the basis for assembling a toolbox for the development of the study. The different meanings of Collective Health were presented on the basis of the empirical material, as well as of an analysis considering other perspectives on the same object, through which an authorial view of the object under study was constructed. In addition, after a survey of concepts and theories about fields, different approaches to the conceptualisation of "field" were presented, and the toolbox and the analyses of the signifiers discussed earlier were used to develop a number of considerations and questions.
Drawing on the empirical and theoretical-conceptual material, an analysis of Collective Health in Brazil was developed in order to understand the field through a critical look at the scientification of areas of knowledge. Considering the singularity of a field still in transformation, constituted in a particular political scenario in which the Brazilian Sanitary Reform was under construction, its conformation is understood as a field of militant knowledges and practices, directed at building new paradigms for explaining and intervening in the health of the Brazilian people.

Relevance: 10.00%

Abstract:

Single-chain technology (SCT) allows the transformation of individual polymer chains into folded/collapsed unimolecular soft nanoparticles. In this work we contribute to the enlargement of the SCT toolbox by demonstrating the efficient synthesis of single-chain polymer nanoparticles (SCNPs) via intrachain amide formation. In particular, we exploit cross-linking between active methylene groups and isocyanate moieties as a powerful "click"-chemistry driving force for SCNP construction. By employing poly(methyl methacrylate)- (PMMA-) based copolymers bearing beta-ketoester units distributed randomly along the copolymer chains, together with bifunctional isocyanate cross-linkers, SCNPs were successfully synthesized at room temperature under appropriate reaction conditions. The resulting SCNPs were characterized by a combination of techniques including size exclusion chromatography (SEC), infrared (IR) spectroscopy, proton nuclear magnetic resonance (1H NMR) spectroscopy, dynamic light scattering (DLS), and elemental analysis (EA).

Relevance: 10.00%

Abstract:

The PiP model is a software application with a user-friendly interface for calculating vibration from underground railways. This paper reports on the software, with a focus on its latest version and on plans for future development. The software calculates the power spectral density of vibration due to a train moving on floating-slab track, with track irregularity described by typical spectra for tracks in good, average and bad condition. The latest version accounts for a tunnel embedded in a half-space by employing a toolbox developed at K.U. Leuven that calculates Green's functions for a multi-layered half-space. © 2009 IOP Publishing Ltd.
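The core relation such a linear model relies on can be sketched as follows. This is a generic illustration with assumed parameter values (a single-degree-of-freedom frequency response with a 40 Hz resonance and 5% damping, and an assumed roughness spectrum), not the PiP model itself: the response PSD is the input PSD scaled by the squared magnitude of the frequency response function.

```python
import numpy as np

# Generic sketch (assumed values, not the PiP model): for a linear system,
# S_out(f) = |H(f)|^2 * S_in(f), where S_in is the track-roughness PSD.
f = np.linspace(1.0, 80.0, 200)      # frequency band of interest, Hz
S_in = 1e-6 / f**3                   # assumed roughness spectrum shape

r = f / 40.0                         # frequency ratio for a 40 Hz resonance
zeta = 0.05                          # assumed damping ratio
H = 1.0 / np.sqrt((1.0 - r**2) ** 2 + (2.0 * zeta * r) ** 2)  # 1-DOF FRF magnitude

S_out = H**2 * S_in                  # response PSD
```

The same |H|² scaling applies component by component in a multi-layer model; only the computation of H (here a textbook oscillator) becomes more involved.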

Relevance: 10.00%

Abstract:

Much effort has been devoted in recent years to probing the interactions of small molecules with amyloid fibrils and other protein aggregates. Understanding and controlling such interactions is important for the development of diagnostic and therapeutic strategies where protein aggregation is associated with disease. In this perspective article we give an overview of the toolbox of biophysical methods for studying such amyloid-small-molecule interactions. We discuss in detail two recently developed techniques within this framework: linear dichroism, a promising extension of the more traditional spectroscopic techniques, and biosensing methods, in which surface-bound amyloid fibrils are exposed to solutions of small molecules. Both techniques rely on measuring physical properties that are directly linked to the binding of small molecules to amyloid aggregates, and therefore provide an attractive route to probing these important interactions.

Relevance: 10.00%

Abstract:

Variational methods are a key component of the approximate inference and learning toolbox. These methods occupy an important middle ground: they retain distributional information about uncertainty in latent variables, unlike maximum a posteriori (MAP) methods, yet generally require less computation than Markov chain Monte Carlo (MCMC) methods. In particular, the variational expectation maximisation (vEM) and variational Bayes algorithms, both of which optimise a variational free energy, are widely used in time-series modelling. Here, we investigate the success of vEM in simple probabilistic time-series models. First we consider the inference step of vEM, and show that a consequence of the well-known compactness property of variational inference is a failure to propagate uncertainty in time, limiting the usefulness of the retained distributional information. In particular, the uncertainty may appear smallest precisely where the approximation is poorest. Second, we consider parameter learning and analytically reveal systematic biases in the parameters found by vEM. Surprisingly, simpler variational approximations (such as mean-field) can lead to less bias than more complicated structured approximations.
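The compactness property mentioned above can be seen in a minimal example. The sketch below (illustrative only, not from the paper) computes the optimal fully factorised (mean-field) Gaussian approximation to a correlated bivariate Gaussian posterior: the factorised variance is 1/Λ_ii, which underestimates the true marginal variance Σ_ii whenever the variables are correlated.

```python
import numpy as np

# Toy illustration (not from the paper): mean-field variational inference
# underestimates posterior variance for a correlated Gaussian, because the
# optimal factorised approximation matches the conditional, not the
# marginal, of each variable.
Sigma = np.array([[1.0, 0.9],
                  [0.9, 1.0]])        # true posterior covariance
Lambda = np.linalg.inv(Sigma)        # precision matrix

true_marginal_var = Sigma[0, 0]      # = 1.0
mf_var = 1.0 / Lambda[0, 0]          # = det(Sigma)/Sigma_22 = 0.19
```

The stronger the correlation, the more overconfident the factorised approximation becomes, which is exactly the failure mode that blocks uncertainty propagation in a time-series chain.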

Relevance: 10.00%

Abstract:

This paper describes the construction of a research platform for an underwater vehicle-manipulator system. The design and experimental testing of a three-function electric underwater manipulator are presented in detail, together with the design results for the vehicle subsystem. A simulation model of the system was built using a Matlab toolbox and M-functions; it can be used to effectively verify planning and control algorithms (including separate control of the vehicle subsystem and the manipulator subsystem) and to provide guidance and method validation for subsequent field trials.

Relevance: 10.00%

Abstract:

Owing to the complexity of autonomous underwater vehicle (AUV) technology, system simulation has become increasingly important. This paper systematically analyses the motion model and spatial equations of motion of an AUV and, using SIMULINK under MATLAB, designs a full-degree-of-freedom simulation toolbox for AUVs. The toolbox comprises several components, including vehicle body motion, position and attitude computation, and coordinate-frame transformation, and allows convenient full-degree-of-freedom simulation of control methods.
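As a rough illustration of what the body-motion and coordinate-transformation blocks of such a toolbox compute, the sketch below (a hypothetical Python analogue, not the SIMULINK toolbox itself, and reduced from six to three degrees of freedom) integrates planar vehicle kinematics, mapping body-frame velocities (surge, sway, yaw rate) to earth-frame position and heading with explicit Euler steps.

```python
import numpy as np

# Minimal sketch (assumed, not from the paper): planar 3-DOF kinematics.
# state = (x, y, psi) in the earth frame; nu = (u, v, r) in the body frame.
def step(state, nu, dt):
    x, y, psi = state
    u, v, r = nu
    # Rotate body-frame velocities into the earth frame, then integrate.
    x += dt * (u * np.cos(psi) - v * np.sin(psi))
    y += dt * (u * np.sin(psi) + v * np.cos(psi))
    psi += dt * r
    return (x, y, psi)

state = (0.0, 0.0, 0.0)
for _ in range(1000):                        # 10 s at dt = 0.01
    state = step(state, nu=(1.0, 0.0, 0.1), dt=0.01)
```

With constant surge and yaw rate the vehicle traces a circle of radius u/r, which makes a convenient sanity check for the coordinate-transformation block.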

Relevance: 10.00%

Abstract:

Anthropogenic pollutant chemicals pose a major threat to aquatic organisms, and more research is needed on emerging categories of environmental chemicals such as nanomaterials, endocrine disruptors and pharmaceuticals. Proteomics offers advantages for early warning of alterations in environmental quality by detecting sub-lethal changes in sentinel species such as the mussel Mytilus edulis. This thesis aimed to compare the potential of traditional biomarkers (such as enzyme activity measurements) with newer redox proteomic approaches. Environmental proteomics, and especially a redox proteomics toolbox, may offer a novel way to study pollutant effects on organisms that can also yield information on risks to human health. In particular, it can probe subtle biochemical changes at sub-lethal concentrations and thus offer new insights into toxicity mechanisms. The present research first involved a field study at three stations in Cork Harbour, Ireland (Haulbowline, Ringaskiddy and Douglas), compared with an out-harbour control site in Bantry Bay, Ireland. Further research was then carried out to detect the effects of selected anthropogenic chemicals. Diclofenac is an example of the veterinary and human pharmaceuticals, an emerging category of chemical pollutants, with the potential to cause serious toxicity to non-target organisms. The second chemical studied was copper, a key source of contamination in marine ecosystems. The third, bisphenol A, is a major anthropogenic chemical mainly used in polycarbonate plastics manufacturing; it is widespread in the environment and is suspected to be an endocrine disruptor. Effects on the gill, the principal feeding organ of mussels, were investigated in particular; effects on the digestive gland were also investigated to compare outcomes between tissues.
Across the three chemicals studied (diclofenac, copper and bisphenol A), only diclofenac exposure showed no significant difference in glutathione transferase (GST) responses, whereas copper and bisphenol A significantly increased GST in gill. Glutathione reductase (GR) enzyme analysis revealed that all three chemicals elicited significant responses in gill. Catalase activity showed significant differences in digestive gland exposed to diclofenac and in gills exposed to bisphenol A. The study then focused on the application to M. edulis of redox proteomics, the study of the oxidative modification of proteins. Thiol proteins were labelled with 5-iodoacetamidofluorescein prior to one-dimensional and two-dimensional electrophoresis. This clearly revealed similarities in a portion of the redox proteome across chemical exposures, indicating where toxicity mechanisms may be common and where effects are unique to a single treatment. This thesis documents that proteomics is a robust tool for providing valuable insights into possible mechanisms of toxicity of anthropogenic contaminants in M. edulis. It is concluded that future research should focus on gill tissue, on protein thiols and on key individual proteins discovered in this study, such as calreticulin and arginine kinase, which had not been considered as biomarkers in aquatic toxicology prior to this study.

Relevance: 10.00%

Abstract:

Logic-based models are thriving within artificial intelligence. A great number of new logics have been defined and their theory investigated. Epistemic logics introduce modal operators for knowledge or belief; deontic logics are about norms and introduce operators of deontic necessity and possibility (i.e., obligation or prohibition). Then there is a much-investigated class, temporal logics, to whose application to engineering this special issue is devoted. This kind of formalism deserves wider recognition and application in engineering, a domain where other kinds of temporal models (e.g., Petri nets) are by now a fairly standard part of the modelling toolbox.
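As a minimal illustration of the temporal operators involved, the sketch below (a toy example, not taken from this special issue) evaluates the basic "eventually" (F) and "globally" (G) operators of linear temporal logic over a finite execution trace, the kind of check that underlies model-based verification of engineering designs.

```python
# Toy sketch (assumed, illustrative): finite-trace semantics for two basic
# linear temporal logic operators, where a trace is a list of states and a
# proposition is a predicate on a state.
def eventually(trace, p):
    """F p: p holds at some point along the trace."""
    return any(p(s) for s in trace)

def globally(trace, p):
    """G p: p holds at every point along the trace."""
    return all(p(s) for s in trace)

trace = [{"req": True, "ack": False},
         {"req": True, "ack": True}]

# "Every request is eventually acknowledged" on this finite trace:
ok = all(eventually(trace[i:], lambda s: s["ack"])
         for i, s in enumerate(trace) if s["req"])
```

Richer operators ("until", "next") and infinite-trace semantics follow the same pattern, with model checkers doing this evaluation symbolically rather than by enumeration.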

Relevance: 10.00%

Abstract:

Remote sensing airborne hyperspectral data are routinely used for applications including algorithm development for satellite sensors, environmental monitoring and atmospheric studies. Single flight lines of airborne hyperspectral data are often in the region of tens of gigabytes in size. This means that a single aircraft can collect terabytes of remotely sensed hyperspectral data during a single year. Before these data can be used for scientific analyses, they need to be radiometrically calibrated, synchronised with the aircraft's position and attitude and then geocorrected. To enable efficient processing of these large datasets the UK Airborne Research and Survey Facility has recently developed a software suite, the Airborne Processing Library (APL), for processing airborne hyperspectral data acquired from the Specim AISA Eagle and Hawk instruments. The APL toolbox allows users to radiometrically calibrate, geocorrect, reproject and resample airborne data. Each stage of the toolbox outputs data in the common Band Interleaved Lines (BILs) format, which allows its integration with other standard remote sensing software packages. APL was developed to be user-friendly and suitable for use on a workstation PC as well as for the automated processing of the facility; to this end APL can be used under both Windows and Linux environments on a single desktop machine or through a Grid engine. A graphical user interface also exists. In this paper we describe the Airborne Processing Library software, its algorithms and approach. We present example results from using APL with an AISA Eagle sensor and we assess its spatial accuracy using data from multiple flight lines collected during a campaign in 2008 together with in situ surveyed ground control points.
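For readers unfamiliar with the BIL layout used between processing stages, the sketch below (illustrative only, not APL code; the file name and dimensions are made up) shows how a band-interleaved-by-line cube maps onto a flat binary file with numpy: for each image line, all bands of that line are stored consecutively.

```python
import os
import tempfile
import numpy as np

# Illustrative sketch (assumed, not APL's implementation): a BIL cube has
# axis order (lines, bands, samples), so flattening it row-major gives the
# band-interleaved-by-line byte layout directly.
n_lines, n_bands, n_samples = 4, 3, 5
cube = np.arange(n_lines * n_bands * n_samples,
                 dtype=np.float32).reshape(n_lines, n_bands, n_samples)

path = os.path.join(tempfile.mkdtemp(), "scene.bil")  # hypothetical file
cube.tofile(path)                                     # raw BIL, no header

# Reading back requires the dimensions, normally taken from a header file.
restored = np.fromfile(path, dtype=np.float32).reshape(n_lines, n_bands, n_samples)
band0 = restored[:, 0, :]        # a single band as a (lines, samples) image
```

The absence of an embedded header is why BIL files travel with a companion header describing dimensions, data type and interleave, which most remote-sensing packages read automatically.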


Relevance: 10.00%

Abstract:

This paper presents an experimental investigation into the velocity distribution downstream of a propeller operating at bollard-pull conditions in the presence of a mobile sediment bed. Previous investigations either ignored the effect of a rudder in the wash or considered only its influence on an unconfined jet. The velocity profiles within the jet produced by a rotating propeller with a rudder present were measured over a mobile bed and compared with currently available predictive equations. The velocity distribution profiles in the jet, influenced by bed proximity, were found not to comply with current predictive methods; the measured distributions were complex and non-symmetrical. To provide a basic velocity prediction tool, a neural-network analysis toolbox within Matlab was trained on the experimental data.
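As an indication of how such a data-driven velocity predictor can be built, the sketch below is a hypothetical Python stand-in for the Matlab toolbox: a single-hidden-layer network with random tanh features whose output weights are fitted by least squares. The inputs, targets and decay profile are synthetic, for illustration only, not the paper's experimental data.

```python
import numpy as np

# Hypothetical stand-in (assumed, not the authors' model): random-feature
# regression, i.e. a one-hidden-layer tanh network with only the output
# layer fitted, trained on a synthetic jet-decay profile.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))       # e.g. normalised (x/D, r/D)
y = np.exp(-3.0 * X[:, 1]) / (1.0 + X[:, 0])   # assumed velocity decay shape

W1 = rng.normal(0.0, 2.0, size=(2, 16))        # fixed random hidden weights
b1 = rng.normal(0.0, 1.0, size=16)
H = np.tanh(X @ W1 + b1)                       # hidden-layer activations

Phi = np.column_stack([H, np.ones(len(H))])    # add a bias column
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)    # fit output weights

mse = float(np.mean((Phi @ w - y) ** 2))       # training error of the fit
```

A fully trained network (as in the Matlab toolbox) would also adapt the hidden weights; fixing them keeps this sketch deterministic and short.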

Relevance: 10.00%

Abstract:

Objective: The purpose of this study was to show the association between changes in clinician self-efficacy and readiness to change and the implementation of an asthma management program (Easy Breathing). Methods: A 36-month randomized, controlled trial was conducted involving 24 pediatric practices (88 clinicians). Randomized clinicians received interventions designed to enhance clinician self-efficacy and readiness to change, which were measured at baseline and at 3 years. Interventions consisted of an educational toolbox, seminars, teleconferences, mini-fellowships, opinion-leader visits, clinician-specific feedback, and pay for performance. The primary outcome was program utilization (number of children enrolled in Easy Breathing per year); secondary outcomes included development of a written treatment plan and severity-appropriate therapy. Results: At baseline, clinicians enrolled 149 ± 147 (mean ± SD) children/clinician/year; 84% of children had a written treatment plan and 77% of plans used severity-appropriate therapy. At baseline, higher self-efficacy scores were associated with greater program utilization (relative rate [RR], 1.34; 95% confidence interval [CI], 1.04-1.72; P = .04) but not with treatment plan development (RR, 0.63; 95% CI, 0.29-1.35; P = .23) or anti-inflammatory use (RR, 1.76; 95% CI, 0.92-3.35; P = .09). Intervention clinicians participated in 17 interventions over 36 months. At study end, self-efficacy scores had increased in intervention clinicians compared with control clinicians (P = .01) and more clinicians were in an action stage of change (P = .001), but these changes were not associated with changes in the primary or secondary outcomes. Conclusions: Self-efficacy scores correlated with program use at baseline and increased in the intervention arm, but these increases were not associated with greater program-related activity. Self-efficacy may be necessary but not sufficient for behavior change. Copyright © 2012 by Academic Pediatric Association.

Relevance: 10.00%

Abstract:

The Richardson-Lucy algorithm is one of the most important algorithms in image deconvolution. One of its drawbacks, however, is slow convergence. A very significant acceleration is obtained by the technique proposed by Biggs and Andrews (BA), which is implemented in the deconvlucy function of the MATLAB Image Processing Toolbox. The BA method was developed heuristically, with no proof of convergence. In this paper, we introduce the heavy-ball (H-B) method for Poisson data optimization and extend it to a scaled H-B method, which includes the BA method as a special case. The method comes with a proven convergence rate of O(1/k^2), where k is the number of iterations. We demonstrate the superior convergence performance of the scaled H-B method on both synthetic and real 3D images.
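The flavour of the acceleration can be sketched as follows. This is an illustrative toy implementation, not the paper's scaled H-B method: plain Richardson-Lucy multiplicative updates on a 1D signal, with an optional fixed-momentum extrapolation step in the spirit of Biggs-Andrews (the actual BA and H-B schemes choose the momentum adaptively).

```python
import numpy as np

# Illustrative sketch (assumed): Richardson-Lucy deconvolution with an
# optional heavy-ball-style extrapolation before each multiplicative step.
def rl_deconvolve(y, psf, n_iter=50, momentum=0.0):
    A = lambda x: np.convolve(x, psf, mode="same")         # forward blur
    AT = lambda x: np.convolve(x, psf[::-1], mode="same")  # adjoint blur
    x = np.full_like(y, y.mean())                          # flat positive start
    x_prev = x.copy()
    for _ in range(n_iter):
        v = np.clip(x + momentum * (x - x_prev), 1e-12, None)  # extrapolate
        x_prev = x
        x = v * AT(y / np.clip(A(v), 1e-12, None))             # RL update
    return x

# Toy example: two spikes blurred by a normalised Gaussian PSF.
psf = np.exp(-0.5 * np.arange(-3, 4) ** 2)
psf /= psf.sum()
truth = np.zeros(64)
truth[20], truth[40] = 1.0, 2.0
y = np.convolve(truth, psf, mode="same")

x_plain = rl_deconvolve(y, psf, n_iter=100)                 # standard RL
x_fast = rl_deconvolve(y, psf, n_iter=100, momentum=0.4)    # with momentum
```

Both variants sharpen the blurred observation toward the spikes; the momentum term is what turns the O(1/k) behaviour of plain RL into the accelerated regime the paper analyses.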