880 results for REGULATORY AGENCIES
Abstract:
Metronidazole is a BCS (Biopharmaceutics Classification System) class 1 drug, traditionally considered the drug of choice for treating infections caused by protozoa and anaerobic microorganisms. This study aimed to evaluate the bioequivalence of two different marketed 250 mg metronidazole immediate-release tablets. A randomized, open-label, 2×2 crossover study was performed in healthy Brazilian volunteers under fasting conditions with a 7-day washout period. The formulations were administered as a single oral dose, and blood was sampled over 48 h. Metronidazole plasma concentrations were determined by a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method. A plasma concentration vs. time profile was generated for each volunteer, and the pharmacokinetic parameters C-max, T-max, AUC(0-t), AUC(0-infinity), k(e), and t(1/2) were calculated using a noncompartmental model. Bioequivalence between the pharmaceutical formulations was determined by calculating 90% confidence intervals (CIs) for the test/reference ratios of C-max, AUC(0-t), and AUC(0-infinity) using log-transformed data. Twenty-two healthy volunteers (11 men, 11 women; mean (SD) age, 28 (6.5) years [range, 21-45 years]; mean (SD) weight, 66 (9.3) kg [range, 51-81 kg]; mean (SD) height, 169 (6.5) cm [range, 156-186 cm]) were enrolled in and completed the study. The 90% CIs for C-max (0.92-1.06), AUC(0-t) (0.97-1.02), and AUC(0-infinity) (0.97-1.03) for the test and reference products fell within the 0.80-1.25 interval adopted by most regulatory agencies, including the Brazilian agency ANVISA. No clinically significant adverse effects were reported. Based on the pharmacokinetic analysis, it was concluded that the test 250 mg metronidazole formulation is bioequivalent to the reference product according to the requirements of the Brazilian agency.
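The acceptance criterion described above can be sketched in code. This is an illustrative simplification with hypothetical concentration values: a full 2×2 crossover analysis would use an ANOVA with sequence, period, and subject effects, whereas this sketch computes only a paired 90% CI on log-transformed data.

```python
import math
from statistics import mean, stdev

# One-sided 95% t quantile for df = 21 (22 subjects), so the two-sided
# interval has 90% coverage; illustrative constant, not recomputed per n.
T_ONE_SIDED_95 = 1.721

def be_90ci(test, ref, t_crit=T_ONE_SIDED_95):
    """90% CI for the geometric mean ratio test/reference (paired sketch)."""
    d = [math.log(t) - math.log(r) for t, r in zip(test, ref)]
    m, se = mean(d), stdev(d) / math.sqrt(len(d))
    # back-transform from the log scale to a ratio
    return math.exp(m - t_crit * se), math.exp(m + t_crit * se)

def bioequivalent(lo, hi):
    # standard regulatory acceptance interval (ANVISA, FDA, EMA): 0.80-1.25
    return lo >= 0.80 and hi <= 1.25
```

With hypothetical C-max values close to one another, the CI falls well inside 0.80-1.25 and the criterion is met.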
Abstract:
While the pathology peer review/pathology working group (PWG) model has long been used in mammalian toxicologic pathology to ensure the accuracy, consistency, and objectivity of histopathology data, application of this paradigm to ecotoxicological studies has thus far been limited. In the current project, the PWG approach was used to evaluate histopathologic sections of gills, liver, kidney, and/or intestines from three previously published studies of diclofenac in trout, among which there was substantial variation in the reported histopathologic findings. The main objectives of this review process were to investigate and potentially reconcile these interstudy differences, and based on the results, to establish an appropriate no observed effect concentration (NOEC). Following a complete examination of all histologic sections and original diagnoses by a single experienced fish pathologist (pathology peer review), a two-day PWG session was conducted to allow members of a four-person expert panel to determine the extent of treatment-related findings in each of the three trout studies. The PWG was performed according to the United States Environmental Protection Agency (US EPA) Pesticide Regulation (PR) 94-5 (EPA Pesticide Regulation, 1994). In accordance with standard procedures, the PWG review was conducted by the non-voting chairperson in a manner intended to minimize bias, and thus during the evaluation, the four voting panelists were unaware of the treatment group status of individual fish and the original diagnoses associated with the histologic sections. Based on the results of this review, findings related to diclofenac exposure included minimal to slightly increased thickening of the gill filament tips in fish exposed to the highest concentration tested (1,000 μg/L), plus a previously undiagnosed finding, decreased hepatic glycogen, which also occurred at the 1,000 μg/L dose level. 
The panel found little evidence to support other reported effects of diclofenac in trout, and thus the overall NOEC was determined to be >320 μg/L. By consensus, the PWG panel was able to identify diagnostic inconsistencies among and within the three prior studies; this exercise therefore demonstrated the value of the pathology peer review/PWG approach for assessing the reliability of histopathology results that may be used by regulatory agencies for risk assessment.
Abstract:
The Houston region is home to arguably the largest petrochemical and refining complex anywhere. The effluent of this complex includes many potentially hazardous compounds. Study of some of these compounds has led to the recognition that a number of known and probable carcinogens are present at elevated levels in ambient air. Two of these, benzene and 1,3-butadiene, have been found in concentrations that may pose a health risk for residents of Houston. Recent popular journalism and publications by local research institutions have increased public interest in Houston's air quality. Much of the literature has been critical of local regulatory agencies' oversight of industrial pollution. A number of citizens in the region have begun to volunteer with air quality advocacy groups to test community air. Inexpensive methods exist for monitoring ambient concentrations of ozone, particulate matter, and airborne toxics. This study is an evaluation of a technique that has been successfully applied to airborne toxics. This technique, solid-phase microextraction (SPME), has been used to measure airborne volatile organic hydrocarbons at community-level concentrations. It has yielded accurate and rapid concentration estimates at a relatively low cost per sample. Examples of its application to the measurement of airborne benzene exist in the literature; none have been found for airborne 1,3-butadiene. These compounds were selected to evaluate SPME as a community-deployed technique, to replicate previous applications to benzene, to extend the application to 1,3-butadiene, and because of the salience of these compounds in this community. This study demonstrates that SPME is a useful technique for quantification of 1,3-butadiene at concentrations observed in Houston. Laboratory background levels precluded recommendation of the technique for benzene.
One type of SPME fiber, 85 μm Carboxen/PDMS, was found to be a sensitive sampling device for 1,3-butadiene under temperature and humidity conditions common in Houston. This study indicates that these variables affect instrument response, which suggests the need for calibration under specific combinations of these conditions. While deployment of this technique was less expensive than other methods of quantifying 1,3-butadiene, the complexity of calibration may exclude an SPME method from broad deployment by community groups.
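The finding that temperature and humidity affect instrument response, and hence that calibration must be condition-specific, can be illustrated with a sketch. Everything here (the class name, the condition-binning scheme, the linear response model) is a hypothetical illustration, not the study's actual calibration procedure.

```python
# Hypothetical per-condition calibration: a separate linear response
# curve (response = a + b * concentration) is fitted for each
# temperature/humidity bin, since response depends on those variables.
def fit_line(x, y):
    """Least-squares slope and intercept for y = a + b * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

class SpmeCalibration:
    def __init__(self):
        self.curves = {}  # (temp_bin, rh_bin) -> (intercept, slope)

    def calibrate(self, temp_bin, rh_bin, concentrations, responses):
        """Fit a calibration curve for one temperature/humidity bin."""
        self.curves[(temp_bin, rh_bin)] = fit_line(concentrations, responses)

    def estimate(self, temp_bin, rh_bin, response):
        """Invert the bin's curve; raises KeyError if bin is uncalibrated."""
        a, b = self.curves[(temp_bin, rh_bin)]
        return (response - a) / b
```

The per-bin lookup is what makes deployment costly in practice: each field condition needs its own laboratory calibration run.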
Abstract:
A phase I clinical trial is mainly designed to determine the maximum tolerated dose (MTD) of a new drug. Optimizing phase I trial design is crucial to minimize the number of enrolled patients exposed to unsafe dose levels and to provide reliable information to later phases of clinical trials. Although it has been criticized for inefficient MTD estimation, the traditional 3+3 method remains dominant in practice because of its simplicity and conservative estimation. Many newer designs have been shown to generate more credible MTD estimates, such as the Continual Reassessment Method (CRM). Despite its accepted better performance, the CRM design is still not widely used in real trials. Several factors contribute to the difficulty of adopting CRM in practice. First, CRM is not widely accepted by regulatory agencies such as the FDA in terms of safety: it is considered less conservative and tends to expose more patients to doses above the MTD than the traditional design. Second, CRM is relatively complex and not intuitive for clinicians to fully understand. Third, the CRM method takes much more time and requires statistical experts and computer programs throughout the trial. Clinicians therefore still tend to follow the trial process they are comfortable with, and this is unlikely to change in the near future. This motivated us to improve the accuracy of MTD selection while following the procedure of the traditional design to maintain simplicity. We observed that in the 3+3 method, the dose transition and the MTD determination are relatively independent, so we proposed to separate the two stages. The dose transition rule remains the same as in the 3+3 method. After obtaining the toxicity information from the dose transition stage, we apply an isotonic transformation to enforce a monotonically non-decreasing order before selecting the optimal MTD.
To compare the operating characteristics of the proposed isotonic method and other designs, we carried out 10,000 simulated trials under different dose-setting scenarios, comparing the isotonic modified method with the standard 3+3 method, CRM, the biased coin design (BC), and the k-in-a-row design (KIAW). The isotonic modified method improved on the MTD estimation of the standard 3+3 in 39 out of 40 scenarios, with much greater improvement when the target toxicity level is 0.3 rather than 0.25. The modified design is also competitive with the other selected methods: CRM performed better in general but was not as stable as the isotonic method across the different dose settings. The results demonstrate that our proposed isotonic modified method is not only easily conducted using the same procedure as the 3+3 design but also outperforms it, and it can be applied to determine the MTD for any given target toxicity level (TTL). These features make the isotonic modified method of practical value in phase I clinical trials.
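The two-stage idea, 3+3 dose transitions followed by an isotonic transformation before MTD selection, can be sketched as follows. The pool-adjacent-violators algorithm (PAVA) is the standard way to compute an isotonic (non-decreasing) estimate of the dose-toxicity curve; the tie-breaking and dose-selection details here are illustrative assumptions, not necessarily the authors' exact rule.

```python
# Sketch: run standard 3+3 dose transitions (not shown), then apply PAVA
# to the observed per-dose toxicity rates so the estimates respect the
# assumed monotone dose-toxicity relationship, and pick the dose whose
# isotonic estimate is closest to the target toxicity level (TTL).
def pava(rates, weights):
    """Weighted isotonic (non-decreasing) regression via PAVA."""
    vals, wts = list(rates), list(weights)
    blocks = [[i] for i in range(len(vals))]
    i = 0
    while i < len(vals) - 1:
        if vals[i] > vals[i + 1]:          # violator: pool adjacent blocks
            w = wts[i] + wts[i + 1]
            v = (vals[i] * wts[i] + vals[i + 1] * wts[i + 1]) / w
            vals[i:i + 2] = [v]
            wts[i:i + 2] = [w]
            blocks[i:i + 2] = [blocks[i] + blocks[i + 1]]
            i = max(i - 1, 0)              # re-check the previous pair
        else:
            i += 1
    out = [0.0] * sum(len(b) for b in blocks)
    for v, block in zip(vals, blocks):
        for j in block:
            out[j] = v
    return out

def select_mtd(tox, n, target=0.25):
    """Dose index whose isotonic toxicity estimate is closest to target
    (ties broken toward the lower dose; a real design might differ)."""
    iso = pava([t / m for t, m in zip(tox, n)], n)
    return min(range(len(iso)), key=lambda i: abs(iso[i] - target))
```

For example, observed toxicities [0/3, 1/6, 0/3, 2/6] violate monotonicity at dose 3; PAVA pools doses 2-3 before the selection step.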
Abstract:
Extensive spatial and temporal surveys, spanning 15 years, have been conducted on soil in urban parks and on street dusts in one of the most polluted cities in western Europe, Avilés (NW Spain). The first survey was carried out in 1996, and monitoring has since been undertaken every five years. Although the sampling site is a relatively small town, industrial activities (mainly the steel industry and Zn and Al metallurgy) and other less significant urban sources, such as traffic, strongly affect the heavy metal load in the urban aerosol. Elemental tracers have been used to characterise the influence of these sources on the composition of soil and dust. Although PM10 has decreased over these years as a result of environmental measures undertaken in the city, some of the “industrial” elements still remain at concentrations of concern: for example, up to 4.6% and 0.5% Zn in dust and soil, respectively. Spatial trends in metals such as Zn and Cd clearly reflect sources in the processing industries. The concentrations of these elements across Europe have declined over time; however, the most recent results from Avilés revealed an upward trend in the concentrations of Zn, Cd, Hg and As. A risk assessment of the soil highlighted As as an element of concern, since its cancer risk in adults was more than double the value above which regulatory agencies deem it unacceptable. If children are considered the receptors, the risk from this element nearly doubles again.
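The kind of soil-ingestion cancer risk estimate referred to above is conventionally computed as ILCR = (C × IR × EF × ED)/(BW × AT) × SF. The sketch below uses USEPA-style default exposure parameters chosen purely for illustration, not the study's actual inputs, and shows why child receptors roughly double the estimated risk: their soil intake per unit body weight is much higher.

```python
# Illustrative incremental lifetime cancer risk (ILCR) for soil ingestion.
# All parameter values in the test are common USEPA-style defaults used
# for illustration only; they are not the Avilés study's inputs.
def ilcr_ingestion(conc_mg_kg, slope_factor, ir_mg_day, ef_day_yr,
                   ed_yr, bw_kg, at_day):
    """ILCR = chronic daily intake (mg/kg-day) * cancer slope factor.

    conc_mg_kg   -- contaminant concentration in soil (mg/kg)
    slope_factor -- oral cancer slope factor ((mg/kg-day)^-1)
    ir_mg_day    -- soil ingestion rate (mg/day)
    ef_day_yr    -- exposure frequency (days/year)
    ed_yr        -- exposure duration (years)
    bw_kg        -- body weight (kg)
    at_day       -- averaging time (days)
    """
    # 1e-6 converts ingested soil from mg to kg
    cdi = (conc_mg_kg * ir_mg_day * 1e-6 * ef_day_yr * ed_yr) / (bw_kg * at_day)
    return cdi * slope_factor
```

With illustrative defaults (adult: 100 mg/day intake, 70 kg; child: 200 mg/day, 15 kg), the child estimate exceeds the adult one even over a shorter exposure duration.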
Abstract:
The first regulatory agencies were created from the second half of the 1990s onward, the most recent of them in 2005. With the agencies came regulated private actors, users and consumers, and a new form of interaction among the Executive, Legislative, and Judicial branches. These actors take part in and shape the agencies' process of institutional learning. With the creation period behind them and after nearly two decades of existence, a critical view of the agencies is needed. A method of regulatory evaluation is therefore proposed, based on three variables decomposed into several subvariables (questions to be answered objectively). The first variable, institutionalization, measures the rules applicable to the agency itself: the characteristics of the directors' terms of office, decision-making autonomy, and financial and personnel-management autonomy. The second, proceduralization, concerns the agency's decision-making process and its transparency. Both variables seek to measure the agencies from a formal standpoint, based on the applicable norms (laws, decrees, resolutions, ordinances, etc.), and through regulatory practice, based on facts demonstrated in official documents (appointment decrees, decisions, the agencies' own activity reports, etc.). The last variable, judicialization, tracks the occasions on which an administrative decision changes status and the rate at which such decisions are upheld by the Judiciary. The theoretical evaluation model presented here is applied and tested in three sectors that are subject to economic regulation and feature a strong presence of social actors and a federal state-owned company. The agencies analyzed were thus the Agência Nacional de Energia Elétrica (ANEEL), the Agência Nacional de Telecomunicações (ANATEL), and the Agência Nacional de Aviação Civil (ANAC).
In general terms, it is not possible to confirm the existence of isomorphism among these agencies, not even among agencies created at different times and by different presidents. Nor was it possible to show that political interference is the mark of a single government. ANATEL, the best evaluated of the three agencies, stands out for the rigor of its rules, which its decision-making process reflects. ANEEL and ANAC received middling evaluations, as they performed poorly on process but proved to have somewhat better institutions (rules).
Abstract:
This research investigates the social context of the development of Brazilian scientific production in accounting, defending the thesis that, in disseminating their investigations, agents are prioritizing productivist and quantitative aspects and, consequently, relegating to the background the qualitative and epistemological concern [critical vigilance] over that production. Grounded in Pierre Bourdieu's field theory, this study relates academic socialization, the habitus of the agents embedded in the field, the distribution of scientific capital in the accounting area, and the epistemological characteristics of the area's scientific publications, in order to obtain evidence on the problem raised. It is a survey operationalized through semi-structured interviews with a sample of 9 respondents and a documentary study with a sample of 43 articles. The data were analyzed using content analysis. Drawing on Bourdieu (2004, 2008, 2009, 2011, 2013), evidence was found that the theories, concepts, methodologies, techniques, and other choices made by researchers in the accounting area are, most of the time, no more than strategic maneuvers aimed at conquering, reinforcing, securing, or toppling the monopoly of scientific authority, with a view to obtaining greater symbolic power in the field. Regarding the habitus of the agents belonging to the accounting scientific field, a tendency toward productivism was observed, a consequence of the requirements of the bodies regulating accounting research (CAPES) and of the symbolic struggles waged in the field to obtain scientific authority. As for academic socialization, the presence of productivist conduct was reinforced through stricto sensu graduate programs, which pass on to agents the rules of the scientific game, indoctrinating them in how to publish a large number of communications in little time and at lower cost.
The epistemological analyses made it possible to triangulate the last two constructs in order to validate them, and revealed a preference for topics involving accounting aimed at external users and accounting procedures aimed at the financial market, favoring the use of secondary data through documentary research. In methodological terms, positivist studies were unanimously present, with some empiricist aspects, showing an absence of innovation in terms of research guided by alternative methodological approaches, and the use of econometric models to explain the observed reality without theory to ground and explain those models. Finally, the distribution of symbolic capital in the field showed that no individual agent stands out with greater scientific capital, but institutionally FEA/USP occupies that prominent position. It can therefore be concluded that the accounting scientific field remains stagnant and without major theoretical changes because of productivism and the symbolic struggles within the field; facts which, in a way, motivated the creation of a kind of legitimized, institutionalized "magic recipe for publishing" or "ideal format" that is difficult to change unless a scientific revolution alters the existing paradigm.
Abstract:
As the population of Colorado continues to grow, the impacts from individual sewage disposal systems, or onsite wastewater systems (OWS), are becoming more apparent. Increased use of OWS impacts not only water quality but land use and development as well. These impacts have led to the need for a new generation of wastewater regulations in the state, a transition from the historic prescriptive requirements to a more progressive, performance-based system. A performance-based system will allow smarter growth, improved water quality, and cost savings for both the regulatory agencies and the OWS industry in Colorado. This project outlines the challenges and essential elements required to make this transition, and provides guidance on how to meet the challenges and overcome barriers to implementing a performance code in Colorado.
Abstract:
Final dissertation for the Integrated Master's degree in Medicine (Mestrado Integrado em Medicina), Faculdade de Medicina, Universidade de Lisboa, 2014
Abstract:
As part of the Governor's effort to streamline State government through improvements in the efficiency and effectiveness of operations, Executive Order 2004-06 ("EO6") provided for the reorganization (consolidation) of the Department of Insurance, the Office of Banks and Real Estate, the Department of Professional Regulation, and the Department of Financial Institutions. Through EO6 the four predecessor agencies were abolished and a single new agency, the Department of Financial and Professional Regulation (hereafter "IDFPR"), was created. The purpose of consolidating the four regulatory agencies was to allow certain economies of scale to be realized, primarily within the executive management and administrative functions. Additionally, the consolidation would increase the effectiveness of operations by integrating certain duplicative functions of the four predecessor agencies without degrading frontline functions. Beginning on or about July 1, 2004, the IDFPR began consolidation activities focusing primarily on the administrative functions of Executive Management, Fiscal and Accounting, General Counsel, Human Resources, Information Technology, and Other Administrative Services. The underlying premise of the reorganization was that all improvements could be accomplished without degrading the frontline functions of the predecessor agencies. Accordingly, all powers, duties, rights, responsibilities, and functions of the predecessor agencies migrated to IDFPR, and the reorganization activities commenced July 1, 2004.
Abstract:
Description based on: 1911 (1908/10).
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
This paper applies Latour's 1992 translation map as a device to explore the development of, and recent conflict between, two data standards for the exchange of business information: EDIFACT and XBRL. Our research focuses on France, where EDIFACT is well established and XBRL is just emerging. The alliances supporting both standards are both local and global: the French/European EDIFACT is promulgated through the United Nations, while a consortium of national jurisdictions and companies has coalesced around the US-initiated XBRL International (XII). We suggest cultural differences pose a barrier to co-operation between the two networks. Competing data standards create the risk of switching costs. The different technical characteristics of the standards are identified as having implications for regulators and users. A key concern is the lack of co-ordination of data standard production and of the mechanisms regulatory agencies use to choose platforms for electronic data submission.
Abstract:
Background - To assess potentially elevated cardiovascular risk related to new antihyperglycemic drugs in patients with type 2 diabetes, regulatory agencies require a comprehensive evaluation of the cardiovascular safety profile of new antidiabetic therapies. We assessed cardiovascular outcomes with alogliptin, a new inhibitor of dipeptidyl peptidase 4 (DPP-4), as compared with placebo in patients with type 2 diabetes who had had a recent acute coronary syndrome. Methods - We randomly assigned patients with type 2 diabetes and either an acute myocardial infarction or unstable angina requiring hospitalization within the previous 15 to 90 days to receive alogliptin or placebo in addition to existing antihyperglycemic and cardiovascular drug therapy. The study design was a double-blind, noninferiority trial with a prespecified noninferiority margin of 1.3 for the hazard ratio for the primary end point of a composite of death from cardiovascular causes, nonfatal myocardial infarction, or nonfatal stroke. Results - A total of 5380 patients underwent randomization and were followed for up to 40 months (median, 18 months). A primary end-point event occurred in 305 patients assigned to alogliptin (11.3%) and in 316 patients assigned to placebo (11.8%) (hazard ratio, 0.96; upper boundary of the one-sided repeated confidence interval, 1.16; P<0.001 for noninferiority). Glycated hemoglobin levels were significantly lower with alogliptin than with placebo (mean difference, -0.36 percentage points; P<0.001). Incidences of hypoglycemia, cancer, pancreatitis, and initiation of dialysis were similar with alogliptin and placebo. Conclusions - Among patients with type 2 diabetes who had had a recent acute coronary syndrome, the rates of major adverse cardiovascular events were not increased with the DPP-4 inhibitor alogliptin as compared with placebo. (Funded by Takeda Development Center Americas; EXAMINE ClinicalTrials.gov number, NCT00968708.)
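The noninferiority decision rule reported above can be sketched numerically: the upper bound of the one-sided confidence interval for the hazard ratio must fall below the prespecified margin of 1.3. The standard error in the example below is back-calculated for illustration from the reported point estimate (0.96) and upper bound (1.16); the trial itself used a repeated confidence interval procedure.

```python
import math

# Sketch of a noninferiority check on the hazard-ratio scale: the CI is
# computed on log(HR), where the estimate is approximately normal, then
# back-transformed and compared against the prespecified margin.
def noninferior(hr, se_log_hr, margin=1.3, z=1.645):
    """Return (upper bound of one-sided 95% CI for HR, noninferiority met?)."""
    upper = math.exp(math.log(hr) + z * se_log_hr)
    return upper, upper < margin
```

With HR = 0.96 and an illustrative SE(log HR) of about 0.115, the upper bound is about 1.16, below 1.3, matching the trial's noninferiority conclusion.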
Abstract:
Parkinson's disease (PD) is a complex, heterogeneous disorder with an urgent need for disease-modifying therapies. Progress toward successful therapeutic approaches for PD will require an unprecedented level of collaboration. At a workshop hosted by Parkinson's UK and co-organized by the Critical Path Institute's (C-Path) Coalition Against Major Diseases (CAMD) Consortium, investigators from industry, academia, government and regulatory agencies agreed on the need to share data to enable future success. Government agencies included the EMA, FDA, NINDS/NIH and IMI (Innovative Medicines Initiative). Emerging discoveries of new biomarkers and genetic endophenotypes are contributing to our understanding of the underlying pathophysiology of PD. In parallel, there is growing recognition that early intervention will be key for successful treatments aimed at disease modification. At present, there is no comprehensive understanding of disease progression and the many factors that contribute to its heterogeneity. Novel therapeutic targets and trial designs that incorporate existing and new biomarkers to evaluate drug effects independently and in combination are required. The integration of robust clinical data sets is viewed as a powerful approach to hasten medical discovery and therapies, as is being realized across diverse disease conditions through big data analytics for healthcare. Applying lessons learned from parallel efforts is critical to identify barriers and enable a viable path forward. A roadmap is presented for a regulatory, academic, industry and advocacy driven integrated initiative that aims to facilitate and streamline new drug trials and registrations in Parkinson's disease.