Immunological determination of the pharmaceutical diclofenac in environmental and biological samples
Abstract:
A highly sensitive and specific competitive ELISA on 96-microwell plates was developed for the analysis of the nonsteroidal anti-inflammatory drug diclofenac, one of the most frequently detected pharmaceutically active compounds in the European water cycle. The LOD at a signal-to-noise ratio (S/N) of 3 and the IC50 were found to be 6 ng/L and 60 ng/L, respectively, in tap water. In a comparative study using ELISA and GC-MS, diclofenac levels in wastewater from 21 sewage treatment plants were determined, and a good correlation between the two methods was found (ELISA vs. GC-MS: r = 0.70, slope = 0.90, intercept = 0.37, n = 24). An average degradation rate of -25% can be calculated. Lab-scale experiments on the elimination of diclofenac in continuous pilot sewage plants revealed a removal rate of only 5% over a period of 13 weeks. In a further study, the ELISA was applied to extracts of various animal tissues from a range of species, and again a very good relationship between the ELISA and LC-ESI/MS data sets was obtained (r = 0.90, p < 0.0001; n = 117). The ELISA has proven to be a simple, rapid, reliable and affordable alternative to otherwise costly and advanced techniques for the detection of diclofenac in matrix-diverse water samples and tissue extracts after only relatively simple sample preparation. © 2007 American Chemical Society.
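To make the comparative statistics concrete, here is a minimal Python sketch of the kind of method comparison reported above: an ordinary least-squares fit of ELISA against GC-MS concentrations yielding r, slope and intercept. The paired values are hypothetical placeholders, not the study's data (which comprised n = 24 wastewater samples).

# Hypothetical paired measurements (ng/L); illustrative only.
import numpy as np

gcms = np.array([120.0, 340.0, 85.0, 410.0, 230.0, 160.0])
elisa = np.array([135.0, 310.0, 95.0, 395.0, 250.0, 170.0])

slope, intercept = np.polyfit(gcms, elisa, 1)   # least-squares regression line
r = np.corrcoef(gcms, elisa)[0, 1]              # Pearson correlation coefficient

print(f"ELISA vs. GC-MS: r = {r:.2f}, slope = {slope:.2f}, intercept = {intercept:.2f}")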
Abstract:
There is increasing interest in the biomedical field in implantable medical devices that provide a temporary mechanical function inside the human body. In many of these applications, bioresorbable polymer composites of PLLA with β-TCP are increasingly being used for their biocompatibility, biodegradability and mechanical strength [1,3]. These medical devices can be manufactured using conventional plastics processing methods such as injection moulding and extrusion; however, there is a great need to understand and control the process, owing to a lack of knowledge of the influence of processing on material properties. With the addition of biocompatible additives there is also a requirement to predict the quality and level of dispersion within the polymer matrix. On-line UV-Vis spectroscopy has been shown to monitor the quality of fillers in polymers, which can eliminate time-consuming and costly post-process evaluation of additive dispersion. The aim of this work was to identify process-performance relationships of PLLA/β-TCP composites with respect to melt-extrusion conditions. This is part of a wider study into on-line process monitoring of bioresorbable polymers as used in the medical industry.
These results show that the final properties of the PLLA/β-TCP composite are highly influenced by particle size and loading. UV-Vis spectroscopy can be used on-line to monitor the final product and can serve as a valuable quality-control tool in an application where consistent performance is of paramount importance.
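As an illustration of the kind of on-line check such monitoring enables, here is a minimal Python sketch, with hypothetical intensities and tolerance band (not the instrument's actual interface): absorbance is computed from transmitted intensity via the Beer-Lambert relation and compared against a reference band for a well-dispersed filler.

import math

def absorbance(transmitted: float, incident: float) -> float:
    """Beer-Lambert absorbance A = -log10(I / I0)."""
    return -math.log10(transmitted / incident)

I0 = 1000.0                      # incident intensity (arbitrary units, hypothetical)
reference_band = (0.40, 0.55)    # hypothetical absorbance range for good dispersion

for I in (640.0, 355.0, 410.0):  # simulated on-line readings
    A = absorbance(I, I0)
    ok = reference_band[0] <= A <= reference_band[1]
    print(f"A = {A:.3f} -> {'in spec' if ok else 'dispersion alarm'}")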
Abstract:
Reinforced concrete members behave in a highly complex manner under loading because of localised deformations in the concrete (cracks, sliding planes) and between the reinforcement and the concrete (slip). An ideal model for simulating the behaviour of reinforced concrete members should incorporate both the global behaviour and the localised behaviours that are seen and measured in practice, since these localised behaviours directly affect the global behaviour. Most commonly used models do not directly simulate the localised behaviours that can be seen or measured in real members; instead, they overcome this limitation with empirically or semi-empirically derived strain-based pseudo properties, such as effective flexural rigidities for deflection, plastic hinge lengths for strength and ductility, and energy-based approaches both for concrete softening in compression and for concrete softening after tensile cracking to allow for tension stiffening. Most experimental testing of reinforced concrete members is devoted to deriving these pseudo properties for use in design and analysis, and this component of development is thus costly. The aim of the present research is to reduce this cost substantially. In this paper, localised material behaviours and the mechanisms they induce are described; their incorporation into reinforced concrete member behaviour without the need for empirically derived pseudo properties is described in a companion paper.
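One widely used example of such a pseudo property is Branson's effective moment of inertia for computing deflections of cracked members; the Python sketch below uses this standard design-code expression with illustrative numbers (it is not taken from the present paper).

def effective_inertia(M_cr: float, M_a: float, I_g: float, I_cr: float) -> float:
    """Branson: Ie = (Mcr/Ma)^3 * Ig + (1 - (Mcr/Ma)^3) * Icr, capped at Ig."""
    ratio = (M_cr / M_a) ** 3
    return min(I_g, ratio * I_g + (1.0 - ratio) * I_cr)

# Illustrative values (mm^4 and kN.m):
I_g, I_cr = 675e6, 180e6   # gross and fully cracked section inertias
M_cr, M_a = 25.0, 60.0     # cracking moment and applied service moment
print(f"Ie = {effective_inertia(M_cr, M_a, I_g, I_cr):.3e} mm^4")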
Abstract:
This study provides estimates of the macroeconomic impact of non-communicable diseases (NCDs) in China and India for the period 2012–2030. Our estimates are derived using the World Health Organization's EPIC model of economic growth, which focuses on the negative effects of NCDs on labor supply and capital accumulation. We present results for the five main NCDs (cardiovascular disease, cancer, chronic respiratory disease, diabetes, and mental health). Our undiscounted estimates indicate that the cost of the five main NCDs will total USD 23.03 trillion for China and USD 4.58 trillion for India (in 2010 USD). For both countries, the most costly domain is cardiovascular disease. Our analyses also reveal that the costs are much larger in China than in India, mainly because of China's higher and steeper income trajectory and, to a lesser extent, its older population. Rough calculations also indicate that WHO's best buys for addressing the challenge of NCDs are highly cost-beneficial.
Abstract:
BACKGROUND: While the discovery of new drugs is a complex, lengthy and costly process, identifying new uses for existing drugs is a cost-effective approach to therapeutic discovery. Connectivity mapping integrates gene expression profiling with advanced algorithms to connect genes, diseases and small-molecule compounds, and it has been applied in a large number of studies to identify potential drugs, particularly to facilitate drug repurposing. Colorectal cancer (CRC) is a commonly diagnosed cancer with high mortality rates, presenting a worldwide health problem. With the advancement of high-throughput omics technologies, a number of large-scale gene expression profiling studies have been conducted on CRCs, providing multiple datasets in gene expression data repositories. In this work, we systematically apply gene expression connectivity mapping to multiple CRC datasets to identify candidate therapeutics for this disease.
RESULTS: We developed a robust method to compile a combined gene signature for colorectal cancer across multiple datasets. Connectivity mapping analysis with this signature of 148 genes identified 10 candidate compounds, including irinotecan and etoposide, which are chemotherapy drugs currently used to treat CRCs. These results indicate that we have discovered high-quality connections between the CRC disease state and the candidate compounds, and that the gene signature we created may be used as a potential therapeutic target in treating the disease. The method we propose is highly effective in generating a quality gene signature from multiple datasets; the publication of the combined CRC gene signature and the list of candidate compounds from this work will benefit both the cancer and systems biology research communities for further development and investigation.
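For readers unfamiliar with connectivity mapping, the following minimal Python sketch (not the authors' implementation) shows the core scoring idea: a Kolmogorov-Smirnov-style running sum measuring how strongly a disease signature is concentrated at the extremes of a drug-induced ranked gene list, as in the original Connectivity Map approach. Gene names and the ranking are hypothetical.

def enrichment_score(ranked_genes, signature):
    """KS-style running-sum enrichment of `signature` within `ranked_genes`."""
    sig = set(signature)
    n, n_sig = len(ranked_genes), len(sig)
    hit_step = 1.0 / n_sig          # increment when a signature gene is met
    miss_step = 1.0 / (n - n_sig)   # decrement otherwise
    running, best = 0.0, 0.0
    for g in ranked_genes:
        running += hit_step if g in sig else -miss_step
        if abs(running) > abs(best):
            best = running
    return best  # positive: signature concentrated at the top of the ranking

# Hypothetical example: a drug that up-regulates most of the signature genes.
ranked = ["g1", "g7", "g2", "g9", "g3", "g8", "g4", "g5", "g6", "g10"]
print(enrichment_score(ranked, {"g1", "g2", "g3", "g4"}))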
Abstract:
A conventional way to identify bridge frequencies is to use vibration data measured directly on the bridge. A drawback of this approach is that deploying and maintaining the vibration sensors is generally costly and time-consuming. One way to cope with this drawback is an indirect approach that uses the vibrations of a vehicle as it passes over the bridge. In the indirect approach, however, the vehicle vibration includes the effect of road surface roughness, which makes it difficult to extract the bridge modal properties. One solution may be to subtract the signals of two trailers towed by a vehicle so as to reduce the effect of road surface roughness. A simplified vehicle-bridge interaction model is used in the numerical simulation; the vehicle-trailer and bridge systems are modelled as a coupled system. In addition, a laboratory experiment is carried out to verify the results of the simulation and to examine the feasibility of damage detection by the indirect method.
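A minimal Python sketch of the residual idea follows, using synthetic signals rather than the paper's vehicle-bridge interaction model: a roughness component common to both trailer responses cancels on subtraction, and the bridge-related frequency is then read from an FFT peak. The opposite-phase bridge terms are a toy assumption made purely so that the bridge component survives the subtraction.

import numpy as np

fs, T = 200.0, 20.0                  # sampling rate (Hz) and record length (s)
t = np.arange(0.0, T, 1.0 / fs)

rng = np.random.default_rng(0)
roughness = 0.8 * np.sin(2 * np.pi * 2.5 * t) + 0.3 * rng.standard_normal(t.size)
bridge = 0.2 * np.sin(2 * np.pi * 4.0 * t)   # assumed 4 Hz bridge mode

trailer1 = roughness + bridge                # both trailers share the roughness input...
trailer2 = roughness - bridge                # ...bridge term opposite in phase (toy assumption)

residual = trailer1 - trailer2               # common roughness cancels exactly
spectrum = np.abs(np.fft.rfft(residual))
freqs = np.fft.rfftfreq(residual.size, d=1.0 / fs)
print(f"dominant residual frequency: {freqs[spectrum.argmax()]:.2f} Hz")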
Abstract:
How much should an individual invest in immunity as it grows older? Immunity is costly, and its value is likely to change across an organism's lifespan. A limited number of studies have focused on how personal immune investment changes with age in insects, but we do not know how social immunity (immune responses that protect kin) changes across the lifespan, or how resources are divided between these two arms of the immune response. In this study, both personal and social immune functions are considered in the burying beetle, Nicrophorus vespilloides. We show that personal immune function declines (phenoloxidase, PO, levels) or is maintained (defensin expression) across the lifespan in nonbreeding beetles, but is maintained (PO levels) or even upregulated (defensin expression) in breeding individuals. In contrast, social immunity increases in breeding burying beetles up to middle age, before decreasing in old age. Social immunity is not affected by a wounding challenge across the lifespan, whereas personal immunity, through PO, is upregulated following wounding to a similar extent across the lifespan. Personal immune function may be prioritized in younger individuals to ensure survival until reproductive maturity. If not breeding, this may then drop off in later life as condition declines. As burying beetles are ephemeral breeders, breeding opportunities in later life may be rare. When allowed to breed, beetles may therefore invest heavily in "staying alive" in order to complete what could potentially be their final reproductive opportunity. As parental care is important for the survival and growth of offspring in this genus, staying alive to provide care behaviors will clearly have fitness payoffs. This study shows that not all immune traits senesce at the same rate; rather, the patterns observed depend on the immune traits measured and the breeding status of the individual.
Abstract:
In this research, an agent-based model (ABM) was developed to generate human movement routes between homes and water resources in a rural setting, given commonly available geospatial datasets on population distribution, land cover and landscape resources. ABMs are an object-oriented computational approach to modelling a system that focuses on the interactions of autonomous agents and aims to assess the impact of these agents and their interactions on the system as a whole. An A* pathfinding algorithm was implemented to produce walking routes, given data on the terrain in the area; A* is an extension of Dijkstra's algorithm with improved time performance through the use of heuristics. In this example, it was possible to impute daily activity movement patterns to the water resource for all villages in a 75 km long study transect across the Luangwa Valley, Zambia, and the simulated human movements were statistically similar to empirical observations of travel times to the water resource (chi-squared, 95% confidence interval). This indicates that realistic data on human movements can be produced without costly measurement, as is commonly achieved, for example, through GPS or through retrospective or real-time diaries. The approach is transferable between geographical locations, and the product can provide insight into human movement patterns; it therefore has use in many human exposure-related applications, particularly epidemiological research in rural areas, where spatial heterogeneity in the disease landscape and the space-time proximity of individuals can play a crucial role in disease spread.
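The following is a minimal Python sketch of A* on a small grid, purely illustrative (the study ran it over terrain-derived cost surfaces); the Manhattan-distance heuristic is what distinguishes it from Dijkstra's algorithm.

import heapq

def a_star(grid, start, goal):
    """Shortest path on a 0/1 grid (1 = blocked), 4-connected moves."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # admissible heuristic
    frontier = [(h(start), 0, start, [start])]                # (f, cost, node, path)
    seen = set()
    while frontier:
        _, cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier, (cost + 1 + h((nr, nc)), cost + 1,
                                          (nr, nc), path + [(nr, nc)]))
    return None  # no route exists

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(a_star(grid, (0, 0), (2, 0)))   # walks around the blocked row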
Abstract:
BACKGROUND: Diabetic retinopathy is an important cause of visual loss. Laser photocoagulation preserves vision in diabetic retinopathy but is currently used at the stage of proliferative diabetic retinopathy (PDR).
OBJECTIVES: The primary aim was to assess the clinical effectiveness and cost-effectiveness of pan-retinal photocoagulation (PRP) given at the non-proliferative stage of diabetic retinopathy (NPDR) compared with waiting until the high-risk PDR (HR-PDR) stage was reached. There have been recent advances in laser photocoagulation techniques, and in the use of laser treatments combined with anti-vascular endothelial growth factor (VEGF) drugs or injected steroids. Our secondary questions were: (1) If PRP were to be used in NPDR, which form of laser treatment should be used? and (2) Is adjuvant therapy with intravitreal drugs clinically effective and cost-effective in PRP?
ELIGIBILITY CRITERIA: Randomised controlled trials (RCTs) for efficacy; other designs were also used.
REVIEW METHODS: Systematic review and economic modelling.
RESULTS: The Early Treatment Diabetic Retinopathy Study (ETDRS), published in 1991, was the only trial designed to determine the best time to initiate PRP. It randomised one eye of 3711 patients with mild-to-severe NPDR or early PDR to early photocoagulation, and the other to deferral of PRP until HR-PDR developed. The risk of severe visual loss after 5 years for eyes assigned to PRP for NPDR or early PDR, compared with deferral of PRP, was reduced by 23% (relative risk 0.77, 99% confidence interval 0.56 to 1.06). However, the ETDRS did not provide results separately for NPDR and early PDR. In economic modelling, the base case found that early PRP could be more effective and less costly than deferred PRP. Sensitivity analyses gave similar results, with early PRP continuing to dominate or having a low incremental cost-effectiveness ratio. However, there are substantial uncertainties. For our secondary aims we found 12 trials of lasers in diabetic retinopathy, with 982 patients in total (trial sizes ranged from 40 to 150). Most trials were in PDR, but five included some patients with severe NPDR. Three compared multi-spot pattern lasers against the argon laser. RCTs comparing laser applied in a lighter manner (less-intensive burns) with conventional methods (more intense burns) reported little difference in efficacy but fewer adverse effects. One RCT suggested that selective laser treatment targeting only ischaemic areas was effective. Observational studies showed that the most important adverse effect of PRP was macular oedema (MO), which can cause visual impairment, usually temporary. Ten trials of laser and anti-VEGF or steroid drug combinations were consistent in reporting a reduction in the risk of PRP-induced MO.
LIMITATION: The current evidence is insufficient to recommend PRP for severe NPDR.
CONCLUSIONS: There is, as yet, no convincing evidence that modern laser systems are more effective than the argon laser used in the ETDRS, but they appear to have fewer adverse effects. We recommend a trial of PRP for severe NPDR and early PDR compared with deferring PRP until the HR-PDR stage. The trial would use modern laser technologies and investigate the value of adjuvant prophylactic anti-VEGF or steroid drugs.
STUDY REGISTRATION: This study is registered as PROSPERO CRD42013005408.
FUNDING: The National Institute for Health Research Health Technology Assessment programme.
Abstract:
Hidden Markov models (HMMs) are widely used probabilistic models of sequential data. As with other probabilistic models, they require the specification of local conditional probability distributions, whose assessment can be difficult and error-prone, especially when data are scarce or costly to acquire. The imprecise HMM (iHMM) generalizes HMMs by allowing the quantification to be done by sets of probability distributions instead of single distributions. iHMMs have the ability to suspend judgment when there is not enough statistical evidence, and they can serve as a sensitivity analysis tool for standard non-stationary HMMs. In this paper, we consider iHMMs under the strong independence interpretation, for which we develop efficient inference algorithms to address standard HMM usage, such as the computation of likelihoods and most probable explanations, as well as filtering and predictive inference. Experiments with real data show that iHMMs produce more reliable inferences without compromising computational efficiency.
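For context, here is a minimal Python sketch of the precise-HMM baseline: the forward algorithm for computing a sequence likelihood. An iHMM, as described above, would replace each point probability with a set (for example, an interval) and propagate lower and upper bounds instead of the single numbers used here. All parameter values are illustrative.

import numpy as np

pi = np.array([0.6, 0.4])        # initial state distribution
A = np.array([[0.7, 0.3],        # transition matrix
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],        # emission matrix: P(obs | state)
              [0.2, 0.8]])

def likelihood(obs):
    """P(observations) via the forward recursion."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

print(likelihood([0, 1, 0]))     # likelihood of observing symbols 0, 1, 0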
Abstract:
The next-generation sequencing revolution has enabled rapid discovery of genetic markers; however, developing fully functional new markers still requires a long and costly process of marker validation. This study reports a rapid and economical approach for the validation and deployment of polymorphic microsatellite markers obtained from a 454 pyrosequencing library of Atlantic cod, Gadus morhua, Linnaeus 1758. Primers were designed from raw reads to amplify specific amplicon size ranges, allowing effective PCR multiplexing. Multiplexing was combined with a three-primer PCR approach using four universal tails to label amplicons with separate fluorochromes. A total of 192 primer pairs were tested, resulting in 73 polymorphic markers. Of these, 55 loci were combined in six multiplex panels, each containing between six and eleven markers. Variability of the loci was assessed in G. morhua from the Celtic Sea (n = 46) and the Scotian Shelf (n = 46), two locations that have shown genetic differentiation in previous studies. Multilocus FST between the two samples was estimated at 0.067 (P < 0.001). After three loci potentially under selection were excluded, the global FST was estimated at 0.043 (P < 0.001). Our technique combines three-primer and multiplex PCR approaches, allowing simultaneous screening and validation of relatively large numbers of microsatellite loci.
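As background to the differentiation statistics, here is a minimal Python sketch, with hypothetical allele frequencies rather than the study's data, of the quantity being estimated: Wright's FST for one biallelic locus in two populations, FST = (HT - HS) / HT. Multilocus estimators of the kind used in such studies (for example, Weir and Cockerham's) add sample-size corrections.

def fst(p1: float, p2: float) -> float:
    """FST for two populations from the frequency of one allele in each."""
    hs = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2   # mean within-population heterozygosity
    p_bar = (p1 + p2) / 2
    ht = 2 * p_bar * (1 - p_bar)                       # total heterozygosity
    return (ht - hs) / ht

print(f"FST = {fst(0.30, 0.55):.3f}")                  # hypothetical allele frequencies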
Abstract:
Approximate execution is a viable technique for environments with energy constraints, provided that applications are given the mechanisms to produce outputs of the highest possible quality within the available energy budget. This paper introduces a framework for energy-constrained execution with controlled and graceful quality loss. A simple programming model allows developers to structure the computation in different tasks and to express the relative importance of these tasks for the quality of the end result. For non-significant tasks, the developer can also supply less costly, approximate versions. The target energy consumption for a given execution is specified when the application is launched. A significance-aware runtime system employs an application-specific analytical energy model to decide how many cores to use for the execution, the operating frequency for these cores, and the degree of task approximation, so as to maximize the quality of the output while meeting the user-specified energy constraints. Evaluation on a dual-socket 16-core Intel platform using 9 benchmark kernels shows that the proposed framework picks the optimal configuration with high accuracy. Moreover, a comparison with loop perforation (a well-known compile-time approximation technique) shows that the proposed framework delivers significantly higher quality for the same energy budget.
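A minimal Python sketch of the programming model described above follows; the API and cost model are hypothetical, not the paper's runtime. Tasks carry a significance level and, where available, a cheaper approximate version that a scheduler can substitute when the energy budget runs low.

def run_tasks(tasks, energy_budget):
    """Greedy toy scheduler: run accurate versions while the budget allows,
    fall back to approximate versions (if provided) otherwise."""
    results = []
    for task in sorted(tasks, key=lambda t: -t["significance"]):
        if energy_budget >= task["cost"]:
            results.append(task["accurate"]())
            energy_budget -= task["cost"]
        elif task.get("approx") is not None:
            results.append(task["approx"]())
            energy_budget -= task["cost"] * 0.25   # assumed cheaper approximate cost
        # tasks with no affordable accurate version and no approximate one are dropped
    return results

tasks = [
    {"significance": 1.0, "cost": 5.0, "accurate": lambda: "exact sum", "approx": None},
    {"significance": 0.2, "cost": 5.0, "accurate": lambda: "exact stats",
     "approx": lambda: "sampled stats"},
]
print(run_tasks(tasks, energy_budget=6.0))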
Abstract:
Limited access to bank branches excludes over one billion people from accessing financial services in developing countries. Digital financial services offered by banks and mobile money providers through agents can solve this problem without the need for complex and costly physical banking infrastructure. Delivering digital financial services through agents requires a legal framework to regulate liability. This article analyses whether vicarious liability of the principal is a more efficient regulatory approach than personal liability of the agent. Agent liability in Kenya, Fiji, and Malawi is analysed to demonstrate that vicarious liability of the principal, coupled with an explicit agreement as to agent rewards and penalties, is the more efficient regulatory approach.
Abstract:
During the last 30 years, governments almost everywhere in the world have been furthering a global neoliberal agenda by withdrawing the state from the delivery of services, decreasing social spending, lowering corporate taxation, and so on. This restructuring has led to a massive transfer of wealth from the welfare state and working-class people to capital. In order to legitimize this restructuring, conservative governments engage in collective blaming of their denizens. This presentation examines some of the widely circulated phrases used by the dominant elite in several countries during the last year to legitimize the imposition of austerity measures: phrases such as 'We all partied', used by the Irish finance minister, Brian Lenihan, to explain the Irish crisis and collectively blame all Irish people; 'We must all share the pain', deployed by another Irish minister, Gilmore; and the UK coalition administration's sound bite 'We are all in this together'. Utilizing the Gramscian concept of common sense (Gramsci, 1971), I call these phrases 'austerity common sense', because they both reflect and legitimate the austerity agenda. By deploying these phrases, the ruling economic and political elite seek to influence the perception of the people and pre-empt any intention of resistance. The dominant theme of these phrases is that there is no alternative and that austerity measures are somehow self-inflicted and, as such, should not be challenged, because we are all to blame. The purpose of this presentation is to explore the 'austerity common sense' theme from a Gramscian approach, focus on its implications for the social work profession, and discuss ways to resist the imposition of the global neoliberal agenda.
Abstract:
This paper builds on previous work to show how holistic and iterative design optimisation tools can be used to produce a commercially viable product that reduces a costly assembly to a single moulded structure. An assembly consisting of a structural metallic support and a compression-moulded outer shell undergoes design optimisation and analysis to remove the support from the assembly process in favour of a structural moulding. The support is analysed and a sheet moulding compound (SMC) alternative is presented; this is then combined into a manufacturable shell design, which is assessed for viability as an alternative to the original.
Alongside this, a robust material selection system is implemented that removes user bias towards particular materials. This system builds on work by the Cambridge Material Selector and by Boothroyd and Dewhurst, using a selection of materials currently applicable to the compression moulding process. The material selection process is linked into the design and analysis stage via scripts for use in the finite element environment. This builds towards an analysis toolkit intended to develop and enhance the manufacturability of design studies.
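As an illustration of the kind of bias-free ranking such a system performs, here is a minimal Python sketch with illustrative property values (not the paper's database): candidate materials are scored by an Ashby performance index, here E^(1/3)/rho for a stiffness-limited panel of minimum mass, and ranked without any user preference entering the calculation.

candidates = {
    # name: (Young's modulus E in GPa, density rho in g/cm^3) - illustrative values
    "SMC (glass/polyester)": (12.0, 1.9),
    "Steel (mild)":          (210.0, 7.85),
    "Aluminium alloy":       (70.0, 2.7),
}

def panel_index(E: float, rho: float) -> float:
    """Stiffness-limited light panel: maximize E^(1/3) / rho (Ashby index)."""
    return E ** (1.0 / 3.0) / rho

for name, (E, rho) in sorted(candidates.items(),
                             key=lambda kv: -panel_index(*kv[1])):
    print(f"{name:24s} index = {panel_index(E, rho):.3f}")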