969 results for single-molecule analysis


Relevance:

30.00%

Publisher:

Abstract:

Background: Human neuronal protein (hNP22) is a gene with elevated messenger RNA expression in the prefrontal cortex of the human alcoholic brain. hNP22 has high homology with a rat protein (rNP22). These proteins also share homology with a number of cytoskeleton-interacting proteins. Methods: A rabbit polyclonal antibody to an 18-amino acid epitope was produced for use in Western and immunohistochemical analysis. Samples from the human frontal and motor cortices were used for Western blots (n = 10), whereas a different group of frontal cortex and hippocampal samples were obtained for immunohistochemistry (n = 12). Results: The hNP22 antibody detected a single protein in both rat and human brain. Western blots revealed a significant increase in hNP22 protein levels in the frontal cortex but not the motor cortex of alcoholic cases. Immunohistochemical studies confirmed the increased hNP22 protein expression in all cortical layers. This is consistent with results previously obtained using Northern analysis. Immunohistochemical analysis also revealed a significant increase of hNP22 immunoreactivity in the CA3 and CA4 but not other regions of the hippocampus. Conclusions: It is possible that this protein may play a role in the morphological or plastic changes observed after chronic alcohol exposure and withdrawal, either as a cytoskeleton-interacting protein or as a signaling molecule.


Low-temperature (15 K) single-crystal neutron-diffraction structures and Raman spectra of the salts (NX4)2[Cu(OX2)6](SO4)2, where X = H or D, are reported. This study is concerned with the origin of the structural phase change that is known to occur upon deuteration. Data for the deuterated salt were measured in the metastable state, achieved by application of 500 bar of hydrostatic pressure at approximately 303 K followed by cooling to 281 K and the subsequent release of pressure. This allows for the direct comparison between the hydrogenous and deuterated salts, in the same modification, at ambient pressure and low temperature. The Raman spectra provide no intimation of any significant change in the intermolecular bonding. Furthermore, structural differences are few, the largest being for the long Cu-O bond, which is 2.2834(5) and 2.2802(4) Å for the hydrogenous and the deuterated salts, respectively. Calorimetric data for the deuterated salt are also presented, providing an estimate of 0.17(2) kJ/mol for the enthalpy difference between the two structural forms at 295.8(5) K. The structural data suggest that substitution of hydrogen for deuterium gives rise to changes in the hydrogen-bonding interactions that result in a slightly reduced force field about the copper(II) center. The small structural differences suggest different relative stabilities for the hydrogenous and deuterated salts, which may be sufficient to stabilize the hydrogenous salt in the anomalous structural form.


The purpose of this article is to present a defense of the use of single-case studies in management research. The defense is necessary because this type of research has been relegated to a secondary role, or even rejected, by many researchers, who consider it unscientific. Evidence of this low status is the fact that most reputable academic journals in management publish few articles based on single-case studies. In this paper, we examine in detail the objections to the use of such cases in management research, review the efforts made by some researchers to answer these objections, and present quality criteria for research that are alternatives to the criteria of the so-called "scientific method." Our analysis suggests that a better understanding, by researchers with different methodological preferences, of the arguments for each particular use of the single-case study as a research method would allow a better dialogue between researchers and benefit management research as a whole.


The current study focuses on the analysis of pressure surge damping in single pipeline systems generated by a fast change of flow conditions. A dimensionless form of the pressurised transient flow equations was developed, presenting the main advantage of being independent of the system characteristics. In the absence of flow velocity profiles, the unsteady friction in turbulent regimes is analysed based on two new empirical corrective coefficients associated with the local and convective acceleration terms. A new surge damping approach is also presented, taking into account the pressure peak time variation. The observed attenuation effect in the pressure wave for highly deformable pipe materials can be described by a combination of the non-elastic behaviour of the pipe wall with steady and unsteady friction effects. Several simulations and experimental tests have been carried out in order to analyse the dynamic response of single pipelines with different characteristics, such as pipe materials, diameters, thicknesses, lengths and transient conditions.
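The paper's dimensionless model is not reproduced here, but the classical Joukowsky relation Δp = ρ·a·Δv, which bounds the surge pressure for an instantaneous flow change in a single pipeline, gives a feel for the magnitudes involved. A minimal sketch; the density, wave speed and velocity change below are illustrative assumptions, not values from the paper:

```python
def joukowsky_surge(rho, wave_speed, dv):
    """Classical Joukowsky estimate of the pressure rise (Pa) caused by an
    instantaneous velocity change dv (m/s) in a single pipeline:
    dp = rho * a * dv."""
    return rho * wave_speed * dv

# Assumed example: water (1000 kg/m^3), wave speed ~1200 m/s (rigid pipe),
# sudden stoppage of a 2 m/s flow:
dp = joukowsky_surge(1000.0, 1200.0, 2.0)  # -> 2.4e6 Pa (24 bar)
```

Deformable pipe walls lower the effective wave speed, which is one reason the attenuation discussed above is so pronounced for plastic pipes.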


This study was carried out with the aim of modeling, in 2D plane strain, the movement of a soft cohesive soil around a pile, in order to determine the resulting stresses along the pile per unit length. The problem belongs to the class of large-deformation problems and can arise from landslides, proximity to deep excavations, or zones where large loads are applied to the soil. An elasto-plastic constitutive model with the Mohr-Coulomb failure criterion is used to describe the soil behavior, and the analysis considers the soil in undrained conditions. The modeling uses the finite element program PLAXIS, which implements the Updated Lagrangian Finite Element Method (UL-FEM). Special attention is given to the soil-pile interaction: the formulation of the interface elements is presented in some detail, together with studies for a better understanding of their behavior. A 2-D model is developed that simulates the effect of depth, allowing the study of its influence on the stress distribution around the pile. The results provide an important basis for understanding how the soil moves around a pile, how the finite element program PLAXIS works, and how the stresses are distributed around the pile. The analysis demonstrates that soil-structure interaction modeled with the UL-FEM and interface elements is more appropriate for small-deformation problems.
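As a minimal illustration of the failure criterion named above (not the PLAXIS implementation), the Mohr-Coulomb shear strength can be sketched as follows; the cohesion and stress values are hypothetical:

```python
import math

def mohr_coulomb_strength(c, sigma_n, phi_deg):
    """Mohr-Coulomb shear strength: tau_f = c + sigma_n * tan(phi).
    In the undrained analysis used in the study, phi = 0 and the strength
    reduces to the undrained cohesion alone (tau_f = c = s_u)."""
    return c + sigma_n * math.tan(math.radians(phi_deg))

# Undrained soft clay (phi = 0): strength is independent of normal stress.
tau_undrained = mohr_coulomb_strength(25.0, 100.0, 0.0)  # -> 25.0 kPa
```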


One of the most effective ways of controlling vibrations in plate or beam structures is by means of constrained viscoelastic damping treatments. Contrary to the unconstrained configuration, the design of constrained and integrated layer damping treatments is multifaceted because the thickness of the viscoelastic layer acts distinctly on the two main counterparts of the strain energy: the volume of viscoelastic material and the shear strain field. In this work, a parametric study explores the effect that the design parameters, namely the thickness/length ratio, constraining layer thickness, material modulus, natural mode and boundary conditions, have on these two counterparts and, subsequently, on the treatment efficiency. The results evidence an interesting effect when dealing with very thin viscoelastic layers that contradicts the standard treatment-efficiency vs. layer-thickness relation; hence, the potential optimisation of constrained and integrated viscoelastic treatments through properly designed thin multilayer configurations is justified. This work presents a dimensionless analysis and provides useful general guidelines for the efficient design of constrained and integrated damping treatments based on single- or multi-layer configurations. (C) 2012 Elsevier Ltd. All rights reserved.


We have developed a new single-drop microextraction (SDME) method for the preconcentration of organochlorine pesticides (OCPs) from complex matrices. It is based on the use of a silicone ring at the tip of the syringe. A 5 μL drop of n-hexane applied to an aqueous extract containing the OCPs was found to be adequate to preconcentrate them prior to analysis by GC in combination with tandem mass spectrometry. Fourteen OCPs were determined using this technique in combination with programmable temperature vaporization, which is shown to have many advantages over traditional split/splitless injection. The effects of the type of organic solvent, exposure time, agitation and organic drop volume were optimized. Relative recoveries ranging from 59 to 117 %, with repeatabilities below 15 % (coefficient of variation), were achieved. The limits of detection range from 0.002 to 0.150 μg kg−1. The method was applied to the preconcentration of OCPs in fresh strawberry, strawberry jam, and soil.


The species abundance distribution (SAD) has been a central focus of community ecology for over fifty years, and is currently the subject of widespread renewed interest. The gambin model has recently been proposed as a model that provides a superior fit to commonly preferred SAD models. It has also been argued that the model's single parameter (α) presents a potentially informative ecological diversity metric, because it summarises the shape of the SAD in a single number. Despite this potential, few empirical tests of the model have been undertaken, perhaps because the necessary methods and software for fitting the model have not existed. Here, we derive a maximum likelihood method to fit the model, and use it to undertake a comprehensive comparative analysis of the fit of the gambin model. The functions and computational code to fit the model are incorporated in a newly developed free-to-download R package (gambin). We test the gambin model using a variety of datasets and compare the fit of the gambin model to fits obtained using the Poisson lognormal, logseries and zero-sum multinomial distributions. We found that gambin almost universally provided a better fit to the data and that the fit was consistent for a variety of sample grain sizes. We demonstrate how α can be used to differentiate intelligibly between community structures of Azorean arthropods sampled in different land use types. We conclude that gambin presents a flexible model capable of fitting a wide variety of observed SAD data, while providing a useful index of SAD form in its single fitted parameter. As such, gambin has wide potential applicability in the study of SADs, and ecology more generally.
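The gambin software itself is an R package, but the maximum-likelihood fitting idea generalises to any SAD model. Below is a hedged Python sketch applied to the logseries, one of the comparison distributions named above (not to gambin itself, whose likelihood is more involved); the sample abundances are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import logser

def fit_logseries(abundances):
    """Maximum-likelihood fit of the logseries SAD: minimise the negative
    log-likelihood over the single shape parameter p in (0, 1)."""
    counts = np.asarray(abundances)

    def nll(p):
        return -np.sum(logser.logpmf(counts, p))

    res = minimize_scalar(nll, bounds=(1e-6, 1 - 1e-6), method="bounded")
    return res.x

# Hypothetical community: many rare species, a few abundant ones.
sample = [1, 1, 1, 2, 2, 3, 5, 8, 13, 40]
p_hat = fit_logseries(sample)
```

Model comparison then proceeds by computing the same negative log-likelihood for each candidate distribution (Poisson lognormal, zero-sum multinomial, gambin) and ranking them, e.g. by AIC.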


The electroactivity of butylate (BTL) is studied by cyclic voltammetry (CV) and square wave voltammetry (SWV) at a glassy carbon electrode (GCE) and a hanging mercury drop electrode (HMDE). Britton–Robinson buffer solutions of pH 1.9–11.5 are used as supporting electrolyte. CV voltammograms using the GCE show a single anodic peak, corresponding to the oxidation of BTL at +1.7 V versus AgCl/Ag, an irreversible process controlled by diffusion. Using the HMDE, a single cathodic peak is observed at −1.0 V versus AgCl/Ag; the reduction of BTL is irreversible and controlled by adsorption. Mechanisms are proposed for these redox transformations. Optimisation is carried out univariately. Linearity ranges were 0.10–0.50 mmol L−1 and 2.0–9.0 µmol L−1 for the anodic and cathodic peaks, respectively. The proposed method is applied to the determination of BTL in waters, and the analytical results compare well with those obtained by an HPLC method.
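Within a linearity range such as those reported above, quantification reduces to an ordinary least-squares calibration line. A minimal sketch; the concentration/current pairs are invented for illustration, not data from the paper:

```python
import numpy as np

def calibrate(conc, current):
    """Least-squares calibration line current = slope * conc + intercept,
    used to convert a measured peak current into a concentration."""
    slope, intercept = np.polyfit(conc, current, 1)
    return slope, intercept

# Hypothetical anodic-peak calibration (mmol/L vs. arbitrary current units):
conc = [0.10, 0.20, 0.30, 0.40, 0.50]
current = [1.1, 2.0, 3.1, 3.9, 5.0]
slope, intercept = calibrate(conc, current)

# Concentration of an unknown sample from its measured peak current:
unknown = (2.55 - intercept) / slope
```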


Catastrophic events, such as wars and terrorist attacks, tornadoes and hurricanes, earthquakes, tsunamis, floods and landslides, are always accompanied by a large number of casualties. The size distribution of these casualties has separately been shown to follow approximate power law (PL) distributions. In this paper, we analyze the statistical distributions of the number of victims of catastrophic phenomena, in particular, terrorism, and find double PL behavior. This means that the data sets are better approximated by two PLs instead of a single one. We plot the PL parameters, corresponding to several events, and observe an interesting pattern in the charts, where the lines that connect each pair of points defining the double PLs are almost parallel to each other. A complementary data analysis is performed by means of the computation of the entropy. The results reveal relationships hidden in the data that may trigger a future comprehensive explanation of this type of phenomena.
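The double-PL idea above can be sketched with the standard continuous power-law maximum-likelihood estimator (the Hill estimator), fitted separately below and above a breakpoint. A hedged illustration under the assumption of a known breakpoint; the paper's own fitting procedure may differ:

```python
import numpy as np

def pl_exponent(data, xmin):
    """Continuous power-law MLE (Hill estimator):
    alpha = 1 + n / sum(ln(x_i / xmin)) over all x_i >= xmin."""
    x = np.asarray([v for v in data if v >= xmin], dtype=float)
    return 1.0 + len(x) / np.sum(np.log(x / xmin))

def double_pl_exponents(data, breakpoint):
    """Fit one exponent below and one above a given breakpoint -- the
    'double PL' behaviour the paper reports for casualty data."""
    lo = [v for v in data if v < breakpoint]
    hi = [v for v in data if v >= breakpoint]
    return pl_exponent(lo, xmin=min(lo)), pl_exponent(hi, xmin=breakpoint)
```

On a log-log rank/frequency plot, a double PL appears as two straight segments with different slopes meeting near the breakpoint.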


In life cycle impact assessment (LCIA) models, the sorption of the ionic fraction of dissociating organic chemicals is not adequately modeled because conventional non-polar partitioning models are applied. Therefore, high uncertainties are expected when modeling the mobility, as well as the bioavailability for uptake by exposed biota and degradation, of dissociating organic chemicals. Alternative regressions that account for the ionized fraction of a molecule to estimate fate parameters were applied to the USEtox model. The most sensitive model parameters in the estimation of ecotoxicological characterization factors (CFs) of micropollutants were evaluated by Monte Carlo analysis in both the default USEtox model and the alternative approach. Negligible differences in CF values and 95% confidence limits between the two approaches were estimated for direct emissions to the freshwater compartment; however, the default USEtox model overestimates the CFs and the 95% confidence limits of basic compounds by up to three and four orders of magnitude, respectively, relative to the alternative approach for emissions to the agricultural soil compartment. For three emission scenarios, LCIA results show that the default USEtox model overestimates freshwater ecotoxicity impacts for the emission scenarios to agricultural soil by one order of magnitude, with larger confidence limits, relative to the alternative approach.
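The ionized fraction that the alternative regressions account for follows from the standard Henderson-Hasselbalch relation for a monoprotic chemical. A minimal sketch of that relation only, not of the USEtox regressions themselves:

```python
def neutral_fraction(pH, pKa, chemistry="acid"):
    """Henderson-Hasselbalch neutral (un-ionized) fraction of a monoprotic
    dissociating chemical. For an acid: 1 / (1 + 10^(pH - pKa));
    for a base:  1 / (1 + 10^(pKa - pH))."""
    if chemistry == "acid":
        return 1.0 / (1.0 + 10.0 ** (pH - pKa))
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))

# At pH == pKa, half the molecules are neutral:
half = neutral_fraction(7.0, 7.0)  # -> 0.5
```

Because sorption of the neutral and ionic species differs strongly, treating a mostly ionized base as if it were neutral is exactly the kind of error that inflates the default model's CFs for soil emissions.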


In this paper we survey the most relevant results for the priority-based schedulability analysis of real-time tasks, both for the fixed and dynamic priority assignment schemes. We give emphasis to the worst-case response time analysis in non-preemptive contexts, which is fundamental for the communication schedulability analysis. We define an architecture to support priority-based scheduling of messages at the application process level of a specific fieldbus communication network, the PROFIBUS. The proposed architecture improves the worst-case messages' response time, overcoming the limitation of the first-come-first-served (FCFS) PROFIBUS queue implementations.
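The worst-case response time analysis referred to above is usually computed with the textbook fixed-priority recurrence, iterated to a fixpoint, with a blocking term covering the non-preemptive lower-priority work. A hedged sketch of that general recurrence, not the PROFIBUS-specific analysis; the task parameters are invented:

```python
import math

def response_time(C, T, i, B=0):
    """Fixed-priority response-time recurrence for task i, where tasks
    0..i-1 have higher priority and B is the blocking term (e.g. the
    longest non-preemptive lower-priority transmission). Iterates
        w = C[i] + B + sum_j ceil(w / T[j]) * C[j]
    to a fixpoint; returns None if w exceeds the period (taken here as
    the deadline)."""
    w = C[i] + B
    while True:
        w_next = C[i] + B + sum(math.ceil(w / T[j]) * C[j] for j in range(i))
        if w_next == w:
            return w          # converged: worst-case response time
        if w_next > T[i]:
            return None       # bound exceeded: task i is unschedulable
        w = w_next

# Hypothetical task set in rate-monotonic order (shortest period first):
C = [1, 2, 3]
T = [4, 6, 12]
R2 = response_time(C, T, 2)   # -> 10
```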


LLF (Least Laxity First) scheduling, which assigns a higher priority to a task with a smaller laxity, has been known as an optimal preemptive scheduling algorithm on a single processor platform. However, little work has been done to illuminate its characteristics on multiprocessor platforms. In this paper, we identify the dynamics of laxity from the system's viewpoint and translate the dynamics into LLF multiprocessor schedulability analysis. More specifically, we first characterize laxity properties under LLF scheduling, focusing on laxity dynamics associated with a deadline miss. These laxity dynamics describe a lower bound, which leads to the deadline miss, on the number of tasks of certain laxity values at certain time instants. This lower bound is significant because it represents invariants for highly dynamic system parameters (laxity values). Since the laxity of a task depends on the amount of interference from higher-priority tasks, we can then derive a set of conditions to check whether a given task system can go into the laxity dynamics towards a deadline miss. This way, to the authors' best knowledge, we propose the first LLF multiprocessor schedulability test based on its own laxity properties. We also develop an improved schedulability test that exploits slack values. We mathematically prove that the proposed LLF tests dominate the state-of-the-art EDZL tests. We also present simulation results to evaluate the schedulability performance of both the original and improved LLF tests in a quantitative manner.
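The laxity quantity driving the analysis above is simply time-to-deadline minus remaining execution. A minimal LLF dispatch step on an m-processor platform (an illustration of the priority rule, not the paper's schedulability test); the task values are hypothetical:

```python
def laxity(task, now):
    """Laxity at time `now`: slack before the deadline if the task ran
    without interruption from now on."""
    return task["deadline"] - now - task["remaining"]

def llf_pick(tasks, now, m):
    """Pick up to m ready tasks in increasing laxity order -- the LLF rule:
    the smallest-laxity tasks get the m processors."""
    ready = [t for t in tasks if t["remaining"] > 0]
    return sorted(ready, key=lambda t: laxity(t, now))[:m]

tasks = [
    {"name": "A", "deadline": 10, "remaining": 6},   # laxity 4 at t=0
    {"name": "B", "deadline": 7,  "remaining": 2},   # laxity 5 at t=0
    {"name": "C", "deadline": 5,  "remaining": 4},   # laxity 1 at t=0
]
picked = llf_pick(tasks, 0, 2)   # two processors -> C, then A
```

A task with negative laxity can no longer meet its deadline, which is why the paper's analysis tracks how many tasks reach small laxity values simultaneously.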


Graphics processors were originally developed for rendering graphics but have recently evolved towards being an architecture for general-purpose computations. They are also expected to become important parts of embedded systems hardware -- not just for graphics. However, this necessitates the development of appropriate timing analysis techniques, which would be required because techniques developed for CPU scheduling are not applicable. The reason is that we are not interested in how long it takes for any given GPU thread to complete, but rather how long it takes for all of them to complete. We therefore develop a simple method for finding an upper bound on the makespan of a group of GPU threads executing the same program and competing for the resources of a single streaming multiprocessor (whose architecture is based on NVIDIA Fermi, with some simplifying assumptions). We then build upon this method to formulate the derivation of the exact worst-case makespan (and corresponding schedule) as an optimization problem. Addressing the issue of tractability, we also present a technique for efficiently computing a safe estimate of the worst-case makespan with minimal pessimism, which may be used when finding an exact value would take too long.
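To see why "all threads done" rather than "one thread done" is the quantity of interest, consider the crudest safe bound: n identical threads greedily packed onto m parallel execution units finish within ceil(n/m) rounds. A deliberately naive sketch for intuition only; the paper derives far tighter, architecture-aware bounds for a Fermi-like SM:

```python
import math

def naive_makespan_bound(n_threads, n_units, wcet):
    """Naive upper bound on the makespan of n identical threads greedily
    scheduled on n_units parallel execution units, each thread taking at
    most `wcet` time: ceil(n / m) * wcet."""
    return math.ceil(n_threads / n_units) * wcet

# Assumed example: 48 threads, 32 units, 100 cycles per thread.
bound = naive_makespan_bound(48, 32, 100)  # -> 200 cycles
```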


The scope of this paper is to adapt the standard mean-variance model of Henry Markowitz's theory, creating a simulation tool to find the optimal configuration of the portfolio aggregator and to calculate its profitability and risk. Currently, there is a deep discussion going on in the power-systems community about the structure and architecture of the future electric system. In this environment, policy makers and electric utilities seek new approaches to access the electricity market, which creates challenging new positions and calls for innovative strategies and methodologies. Decentralized power generation is gaining relevance in liberalized markets, and small and medium-size electricity consumers are also becoming producers ("prosumers"). In this scenario, an electric aggregator is an entity that joins a group of electric clients, customers, producers and prosumers together as a single purchasing unit to negotiate the purchase and sale of electricity. The aggregator conducts research on electricity prices and contract terms and conditions in order to promote better energy prices for its clients, and allows small and medium customers to benefit from improved market prices.
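The mean-variance quantities at the heart of the Markowitz model are the portfolio's expected return w·μ and variance wᵀΣw. A minimal sketch of these two statistics for an aggregator portfolio; the two-asset returns and covariance matrix below are invented for illustration, not results from the paper:

```python
import numpy as np

def portfolio_stats(weights, mean_returns, cov):
    """Markowitz mean-variance statistics: expected return w . mu and
    risk (variance) w^T C w for portfolio weights w."""
    w = np.asarray(weights, dtype=float)
    ret = float(w @ np.asarray(mean_returns, dtype=float))
    var = float(w @ np.asarray(cov, dtype=float) @ w)
    return ret, var

# Hypothetical two-asset aggregator: a producer contract and a
# "prosumer" contract, equally weighted.
mu = [0.08, 0.05]
cov = [[0.04, 0.01],
       [0.01, 0.02]]
ret, var = portfolio_stats([0.5, 0.5], mu, cov)  # -> (0.065, 0.02)
```

Sweeping the weights and keeping, for each target return, the portfolio of minimum variance traces the efficient frontier the simulation tool searches over.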