913 results for grid-based spatial data


Relevance:

100.00%

Publisher:

Abstract:

Introduction. The DRIVER I project drew up a detailed report of European repositories based on data gathered in a survey in which Spain's participation was very low. This created a highly distorted image of the implementation of repositories in Spain. This study aims to analyse the current state of Spanish open-access institutional repositories and to describe their characteristics. Method. The data were gathered through a Web survey. The questionnaire was based on that used by DRIVER I: coverage; technical infrastructure and technical issues; institutional policies; services created; and stimulators and inhibitors for establishing, filling and maintaining their digital institutional repositories. Analysis. Data were tabulated and analysed systematically according to the responses obtained from the questionnaire and grouped by coverage. Results. Responses were obtained from 38 of the 104 institutions contacted, which had 29 institutional repositories. This represents 78.3% of the Spanish repositories listed in the BuscaRepositorios directory. Spanish repositories contained mainly full-text materials (journal articles and doctoral theses) together with metadata. The software most used was DSpace, followed by EPrints. The metadata standard most used was Dublin Core. Spanish repositories offered more usage statistics and fewer author-oriented services than the European average. The priorities for the future development of the repositories are the need for clear policies on access to publicly funded scientific production and the need for quality-control indicators. Conclusions. This is the first detailed study of Spanish institutional repositories. The key stimulants for establishing, filling and maintaining repositories were, in order of importance, the increase in visibility and citation, the interest of decision-makers, simplicity of use and search services. On the other hand, the main inhibitors identified were the absence of policies, the lack of integration with other national and international systems and the lack of awareness-raising efforts within academia.

Relevance:

100.00%

Publisher:

Abstract:

The National Uniform Crime Reporting System began with 400 cities representing 20 million inhabitants in 43 states on January 1st, 1930. Since the establishment of the Uniform Crime Reporting Program, the volume, diversity, and complexity of crime have steadily increased while the UCR program has remained virtually unchanged. Recognizing the increasing need for more in-depth statistical information and the need to improve the methodology used for compiling, analyzing, auditing, and publishing the collected data, an extensive study of the Uniform Crime Reports was undertaken. The objective of this study was to meet law enforcement needs into the 21st century. The result of the study was NIBRS (National Incident-Based Reporting System). Adoption of the NIBRS system took place in the mid-1980s, and Iowa began organizational efforts to implement the system. Conversion to IBR (Incident-Based Iowa Uniform Crime Reporting) was completed January 1, 1991, as part of a national effort to implement incident-based crime reporting, coordinated by the Federal Bureau of Investigation and the Bureau of Justice Statistics of the U.S. Department of Justice. Iowa was the fifth state in the nation to be accepted as a certified "reporting state" of incident-based crime data to the national system.

Relevance:

100.00%

Publisher:

Abstract:

Debris flows are one of the most important vectors of sediment transfer in mountainous areas. Their hydro-geomorphological behaviour is conditioned by geological, geomorphological, topographical, hydrological, climatic and anthropogenic factors. European research on torrential systems has focused more on hydrological processes than on the geomorphological processes acting as debris-flow triggers. Nevertheless, the identification of the sediment volumes that can potentially be mobilised in small torrential systems, as well as the recognition of the processes responsible for their mobilisation and transfer within the torrential system, is important in terms of land-use planning and natural hazard management. Moreover, a correlation between rainfall and debris-flow occurrence is not always established, and a number of debris flows seem to occur when an intrinsic geomorphological threshold (the degree of sediment infilling of the channel) is reached. A pragmatic methodology has been developed for mapping the sediment storages that may constitute source zones for bedload transport and debris flows, as a preliminary tool before quantifying the volumes transported by these phenomena. It is based on data derived directly from GIS analysis of high-resolution DEMs, field measurements and aerial photograph interpretation. It was conceived to estimate sediment transfer dynamics, taking into account the role of the different sediment stores in the torrential system, by applying the concept of the sediment cascade from a cartographic point of view. Sediment transfer processes were investigated in two small catchments in the Swiss Alps (Bruchi torrent, Blatten bei Naters, and Meretschibach torrent, Agarn). Thorough field geomorphological mapping was coupled with complementary measurements to estimate sediment fluxes and denudation rates, using various methods (reference coloured lines, wooden markers and terrestrial LiDAR). The proposed geomorphological mapping methodology is innovative in comparison with most existing legend systems, which are often not adequate for mapping active and complex geomorphological systems such as debris-flow catchments. The interest of this mapping method is that it allows the concept of the sediment cascade to be implemented spatially, but only for supply-limited systems, in which debris-flow occurrence is controlled by the degree of sediment filling of the channel. The map cannot be used directly for the creation of hazard maps, which focus on the deposition areas, but it can support the design of correction measures and the implementation of monitoring and warning systems.

The second part of this work focuses on geomorphological mapping. An analysis of a sample of 146 (extracts of) maps or legend systems, dating from the 1950s to 2009 and produced in more than 40 different countries, was carried out. Although this study is not exhaustive, it shows a clear renewed interest in the discipline worldwide and highlights the diversity of applications and of the techniques (scale, colours and symbology) used in their conception.
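
Since the mapping method relies on indicators derived directly from high-resolution DEMs in a GIS, a small illustration may help. The sketch below computes terrain slope from a gridded DEM with NumPy and flags steep cells as candidate sediment-source zones; the 5 m cell size, the 30° threshold and the use of `np.gradient` are illustrative assumptions, not the toolchain actually used in the study.

```python
import numpy as np

def slope_degrees(dem: np.ndarray, cell_size: float) -> np.ndarray:
    """Slope (degrees) from a gridded DEM using central differences.

    Illustrative only: real GIS workflows (e.g. Horn's method in standard
    GIS packages) add edge handling and projection-aware cell sizes.
    """
    dz_dy, dz_dx = np.gradient(dem, cell_size)        # rise over run along y and x
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Hypothetical 5 m DEM: flag steep cells as candidate sediment-source zones
dem = np.random.default_rng(0).random((200, 200)) * 50.0
steep = slope_degrees(dem, cell_size=5.0) > 30.0       # threshold is an assumption
print(f"{steep.mean():.1%} of cells exceed the 30-degree threshold")
```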

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: The current study tested the applicability of Jessor's problem behavior theory (PBT) in national probability samples from Georgia and Switzerland. Comparisons focused on (1) the applicability of the problem behavior syndrome (PBS) in both developmental contexts, and (2) the applicability of a set of theory-driven risk and protective factors in the prediction of problem behaviors. METHODS: School-based questionnaire data were collected from n = 18,239 adolescents in Georgia (n = 9499) and Switzerland (n = 8740) following the same protocol. Participants rated five measures of problem behaviors (alcohol and drug use, problems because of alcohol and drug use, and deviance), three risk factors (future uncertainty, depression, and stress), and three protective factors (family, peer, and school attachment). Final study samples included n = 9043 Georgian youth (mean age = 15.57; 58.8% females) and n = 8348 Swiss youth (mean age = 17.95; 48.5% females). Data analyses were completed using structural equation modeling, path analyses, and post hoc z-tests for comparisons of regression coefficients. RESULTS: Findings indicated that the PBS replicated in both samples, and that theory-driven risk and protective factors accounted for 13% and 10% of the variance in the PBS in the Georgian and Swiss samples, respectively, net of the effects of demographic variables. Follow-up z-tests provided evidence of some differences in the magnitude, but not the direction, of five of six individual paths between countries. CONCLUSION: PBT and the PBS find empirical support in these Eurasian and Western European samples; thus, Jessor's theory holds value and promise for understanding the etiology of adolescent problem behaviors outside the United States.
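
The post hoc z-tests compare the same path coefficient estimated independently in the two country samples. The paper's exact formula is not given here, so the sketch below assumes the common form (difference in coefficients divided by the pooled standard error); the coefficient and standard-error values are hypothetical.

```python
from math import sqrt
from scipy.stats import norm

def compare_paths(b1, se1, b2, se2):
    """z-test for equality of two independent regression coefficients
    (Clogg/Paternoster form); an assumption, not taken from the paper."""
    z = (b1 - b2) / sqrt(se1 ** 2 + se2 ** 2)
    p = 2 * (1 - norm.cdf(abs(z)))      # two-sided p-value
    return z, p

# Hypothetical coefficients for one risk-factor -> PBS path in each country
z, p = compare_paths(b1=0.32, se1=0.04, b2=0.21, se2=0.05)
print(f"z = {z:.2f}, p = {p:.3f}")
```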

Relevance:

100.00%

Publisher:

Abstract:

The choice to adopt risk-sensitive measurement approaches for operational risks: the case of the Advanced Measurement Approach under the Basel II New Capital Accord. This paper investigates the choice of the operational risk approach under Basel II requirements and whether the adoption of advanced risk measurement approaches allows banks to save capital. Among the three possible approaches for operational risk measurement, the Advanced Measurement Approach (AMA) is the most sophisticated and requires the use of historical loss data, the application of statistical tools, and the engagement of highly qualified staff. Our results provide evidence that the adoption of AMA is contingent on the availability of bank resources and prior experience in risk-sensitive operational risk measurement practices. Moreover, banks that choose AMA exhibit low capital requirements and, as a result, might gain a competitive advantage over banks that opt for less sophisticated approaches.

Internal Risk Controls and their Impact on Bank Solvency. Recent cases in the financial sector have shown the importance of risk management controls for risk taking and firm performance. Despite advances in the design and implementation of risk management mechanisms, there is little research on their impact on the behavior and performance of firms. Based on data from a sample of 88 banks covering the period between 2004 and 2010, we provide evidence that internal risk controls affect the solvency of banks. In addition, our results show that the level of internal risk controls leads to a higher degree of solvency in banks with a major shareholder, in contrast to widely held banks. However, the relationship between internal risk controls and bank solvency is negatively affected by bank holding company (BHC) growth strategies and by external restrictions on bank activities, while higher regulatory requirements for bank capital positively moderate this relationship.

The Impact of the Sophistication of Risk Measurement Approaches under Basel II on Bank Holding Companies' Value. Previous research has shown the importance of external regulation for banks' behavior. Some inefficient standards may accentuate risk-taking in banks and provoke a financial crisis. Despite the growing literature on the potential effects of Basel II rules, there is little empirical research on the efficiency of risk-sensitive capital measurement approaches and their impact on bank profitability and market valuation. Based on data from a sample of 66 banks covering the period between 2008 and 2010, we provide evidence that prudential ratios computed under Basel II standards predict the value of banks. However, this relation is contingent on the degree of sophistication of the risk measurement approaches that banks apply. Capital ratios are effective in predicting bank market valuation when banks adopt the advanced approaches to compute the value of their risk-weighted assets.

Relevance:

100.00%

Publisher:

Abstract:

Haematopoietic stem cell transplantation (HSCT) is a highly specialised procedure used to treat malignancies of the lymphohaematopoietic system as well as some acquired and inherited disorders of the blood. This analysis by the Swiss Blood Stem Cell Transplantation Group, based on data from 2008-2011, describes treatment rates in Switzerland for specific indications and compares them with data from Germany, France, Italy and the Netherlands, corrected for population size. Differences in transplant rates, in rates for particular indications, and in the use of specific transplant technologies, such as the use of unrelated donors, cord blood or mismatched family donors, are described. These data are correlated with donor availability in international registries and with the number of transplant teams and the number of procedures per team, all corrected for population size.

Relevance:

100.00%

Publisher:

Abstract:

The Diagnosis and Recommendation Integrated System (DRIS) can improve the interpretation of leaf analysis to determine nutrient status. Diagnosis by this method requires DRIS norms, which, however, have not been established for the oil content of soybean seeds. The aims of this study were to establish and test the DRIS method for the oil content of soybean seed (maturity group II cultivars). Soybean leaves (207 samples) at the full flowering stage were analyzed for macro- and micronutrients, and DRIS was applied to assess the relationship between nutrient ratios and the seed oil content. Samples from experimental and farm field sites in the southernmost Brazilian state, Rio Grande do Sul (28°-29° S; 52°-53° W), were assessed in two growing seasons (2007/2008 and 2008/2009). The DRIS norms related to seed oil content differed between the studied years. A single DRIS norm was established for seed oil contents higher than 18.68%, based on data from the 2007/2008 growing season. Higher DRIS indices of B, Ca, Mg and S were associated with a higher oil content, while the opposite was found for K, N and P. DRIS can be used to evaluate the leaf nutrient status of soybean in order to improve the seed oil content of the crop.
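
For readers unfamiliar with DRIS, the index for a nutrient is obtained by comparing each nutrient ratio in the sample with the corresponding norm and averaging the deviations. The sketch below is a simplified illustration of that idea; the scaling by the norm's coefficient of variation, the example norms and the sample values are assumptions and do not reproduce the exact functions or norms derived in the study.

```python
import itertools
import numpy as np

def dris_indices(sample, norm_mean, norm_cv):
    """Simplified DRIS indices: the deviation of each sample ratio a/b from
    its norm, scaled by the norm's coefficient of variation (%), is averaged
    per nutrient (positive when the nutrient sits in the numerator).
    Sketch only; the study's exact functions may differ."""
    index = {n: [] for n in sample}
    for a, b in itertools.permutations(sample, 2):
        if (a, b) not in norm_mean:
            continue
        f = 100.0 * (sample[a] / sample[b] - norm_mean[(a, b)]) \
            / (norm_mean[(a, b)] * norm_cv[(a, b)] / 100.0)
        index[a].append(f)       # nutrient in the numerator moves with f
        index[b].append(-f)      # nutrient in the denominator moves against f
    return {n: float(np.mean(v)) for n, v in index.items() if v}

# Hypothetical norms for two ratios and one leaf sample (g/kg dry matter)
norm_mean = {("N", "P"): 14.0, ("K", "N"): 0.45}
norm_cv = {("N", "P"): 12.0, ("K", "N"): 15.0}
sample = {"N": 48.0, "P": 3.0, "K": 20.0}
print(dris_indices(sample, norm_mean, norm_cv))
```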

Relevance:

100.00%

Publisher:

Abstract:

Because water percolation in unsaturated soils is difficult to estimate, the purpose of this study was to estimate water percolation based on time-domain reflectometry (TDR). TDR probes were installed in two drainage lysimeters with different soil textures, forming water monitoring systems consisting of different numbers of probes. The soils were saturated and covered with plastic to prevent evaporation. Internal drainage tests were carried out using a TDR 100 unit, with dielectric readings taken at constant intervals (every 15 min). To test the consistency of the TDR-estimated percolation levels against the leachate levels observed in the drainage lysimeters, the combined null hypothesis was tested at 5% probability. A larger number of probes in the monitoring system brought the percolation levels estimated from TDR-based moisture data closer to the levels measured by the lysimeters. The number of probes required to estimate water percolation by TDR depends on the physical properties of the soil. For sandy clay soils, three batteries of four probes installed at depths of 0.20, 0.40, 0.60, and 0.80 m, at distances of 0.20, 0.40 and 0.60 m from the center of the lysimeters, were sufficient to estimate percolation levels equivalent to those observed. In the sandy loam soils, the observed and predicted percolation levels were not equivalent even when four batteries of four probes each were used, at depths of 0.20, 0.40, 0.60, and 0.80 m.
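
Because evaporation was blocked, percolation below the monitored profile can be approximated by the decline in stored water between successive TDR readings, summed over the soil layers represented by the probes. The snippet below is a minimal sketch of that water-balance bookkeeping; the 0.20 m layer thicknesses and the example moisture values are assumptions, not the study's data or exact procedure.

```python
import numpy as np

def percolation_mm(theta_t0, theta_t1, layer_thickness_m):
    """Drainage (mm) between two TDR readings, taken as the decline in profile
    water storage, assuming no evaporation or lateral flow (covered lysimeter).
    theta_*: volumetric water content (m3/m3) per monitored layer."""
    d_storage = (np.asarray(theta_t0) - np.asarray(theta_t1)) * np.asarray(layer_thickness_m)
    return float(np.sum(d_storage) * 1000.0)   # metres of water -> mm

# Hypothetical probes at 0.20, 0.40, 0.60 and 0.80 m, each representing a 0.20 m layer
theta_before = [0.38, 0.36, 0.35, 0.34]
theta_after = [0.33, 0.33, 0.33, 0.33]
print(f"Percolation of roughly {percolation_mm(theta_before, theta_after, [0.2] * 4):.1f} mm")
```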

Relevance:

100.00%

Publisher:

Abstract:

General Summary. Although the chapters of this thesis address a variety of issues, the principal aim is common: to test economic ideas in an international economic context. The intention has been to supply empirical findings using the largest suitable data sets and making use of the most appropriate empirical techniques. This thesis can roughly be divided into two parts: the first one, corresponding to the first two chapters, investigates the link between trade and the environment; the second one, the last three chapters, is related to economic geography issues. Environmental problems are omnipresent in the daily press nowadays, and one of the arguments put forward is that globalisation causes severe environmental problems through the reallocation of investments and production to countries with less stringent environmental regulations. A measure of the amplitude of this undesirable effect is provided in the first part. The third and the fourth chapters explore the productivity effects of agglomeration. The computed spillover effects between different sectors indicate how cluster formation might enhance productivity. The last chapter is not about how to better understand the world but how to measure it, and it was just a great pleasure to work on it. "The Economist" writes every week about the impressive population and economic growth observed in China and India, and everybody agrees that the world's center of gravity has shifted. But by how much and how fast did it shift? An answer is given in the last part, which proposes a global measure for the location of world production and allows us to visualize our results in Google Earth. A short summary of each of the five chapters is provided below. The first chapter, entitled "Unraveling the World-Wide Pollution-Haven Effect", investigates the relative strength of the pollution haven effect (PH, comparative advantage in dirty products due to differences in environmental regulation) and the factor endowment effect (FE, comparative advantage in dirty, capital-intensive products due to differences in endowments). We compute the pollution content of imports using the IPPS coefficients (for three pollutants, namely biological oxygen demand, sulphur dioxide and toxic pollution intensity, for all manufacturing sectors) provided by the World Bank and use a gravity-type framework to isolate the two above-mentioned effects. Our study covers 48 countries that can be classified into 29 Southern and 19 Northern countries and uses the lead content of gasoline as a proxy for environmental stringency. For North-South trade we find significant PH and FE effects going in the expected, opposite directions and being of similar magnitude. However, when looking at world trade, the effects become very small because of the high North-North trade share, where we have no a priori expectations about the signs of these effects. Therefore popular fears about the trade effects of differences in environmental regulations might be exaggerated. The second chapter is entitled "Is Trade Bad for the Environment? Decomposing Worldwide SO2 Emissions, 1990-2000". First we construct a novel and large database containing reasonable estimates of SO2 emission intensities per unit of labour that vary across countries, periods and manufacturing sectors. Then we use these original data (covering 31 developed and 31 developing countries) to decompose the worldwide SO2 emissions into the three well-known dynamic effects (scale, technique and composition effects).
We find that the positive scale effect (+9.5%) and the negative technique effect (-12.5%) are the main driving forces of emission changes. Composition effects between countries and sectors are smaller, both negative and of similar magnitude (-3.5% each). Given that trade matters via the composition effects, this means that trade reduces total emissions. We next construct, in a first experiment, a hypothetical world in which no trade happens, i.e. each country produces its imports at home and no longer produces its exports. The difference between the actual world and this no-trade world allows us (ignoring price effects) to compute a static first-order trade effect. The latter now increases total world emissions because it allows, on average, dirty countries to specialize in dirty products. However, this effect is smaller in 2000 (3.5%) than in 1990 (10%), in line with the negative dynamic composition effect identified in the previous exercise. We then propose a second experiment, comparing effective emissions with the maximum or minimum possible level of SO2 emissions. These hypothetical levels of emissions are obtained by reallocating labour across sectors within each country (under the country-employment and world industry-production constraints). Using linear programming techniques, we show that emissions are reduced by 90% with respect to the worst case, but that they could still be reduced by another 80% if emissions were to be minimized. The findings from this chapter go together with those from chapter one in the sense that trade-induced composition effects do not seem to be the main source of pollution, at least in the recent past. Turning to the economic geography part of this thesis, the third chapter, entitled "A Dynamic Model with Sectoral Agglomeration Effects", consists of a short note that derives the theoretical model estimated in the fourth chapter. The derivation is directly based on the multi-regional framework by Ciccone (2002) but extends it to include sectoral disaggregation and a temporal dimension. This allows us to write present productivity formally as a function of past productivity and other contemporaneous and past control variables. The fourth chapter, entitled "Sectoral Agglomeration Effects in a Panel of European Regions", takes the final equation derived in chapter three to the data. We investigate the empirical link between density and labour productivity based on regional data (245 NUTS-2 regions over the period 1980-2003). Using dynamic panel techniques allows us to control for the possible endogeneity of density and for region-specific effects. We find a positive long-run elasticity of labour productivity with respect to density of about 13%. When using data at the sectoral level, it seems that positive cross-sector and negative own-sector externalities are present in manufacturing, while financial services display strong positive own-sector effects. The fifth and last chapter, entitled "Is the World's Economic Center of Gravity Already in Asia?", computes the world's economic, demographic and geographic centers of gravity for 1975-2004 and compares them. Based on data for the largest cities in the world and using the physical concept of center of mass, we find that the world's economic center of gravity is still located in Europe, even though there is a clear shift towards Asia. To sum up, this thesis makes three main contributions.
First, it provides new estimates of orders of magnitude for the role of trade in the globalisation and environment debate. Second, it computes reliable and disaggregated elasticities for the effect of density on labour productivity in European regions. Third, it allows us, in a geometrically rigorous way, to track the path of the world's economic center of gravity.
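
The center-of-gravity calculation in the fifth chapter treats each city as a point mass on the globe and applies the physical concept of center of mass. A minimal sketch of that computation is shown below, assuming cities are weighted by an economic-output measure; the three-city example and the projection of the 3-D centroid back to latitude/longitude are illustrative, not the thesis's actual data or implementation.

```python
import numpy as np

def center_of_gravity(lat_deg, lon_deg, weights):
    """Weighted center of mass of points on a unit sphere, returned as the
    (latitude, longitude) of the surface point in the direction of the 3-D
    centroid.  Weights could be city GDP or population."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    w = np.asarray(weights, dtype=float)
    x = np.sum(w * np.cos(lat) * np.cos(lon))
    y = np.sum(w * np.cos(lat) * np.sin(lon))
    z = np.sum(w * np.sin(lat))
    lon_c = np.degrees(np.arctan2(y, x))
    lat_c = np.degrees(np.arctan2(z, np.hypot(x, y)))
    return lat_c, lon_c

# Hypothetical three-city example (weights stand in for economic output)
print(center_of_gravity([48.9, 40.7, 35.7], [2.4, -74.0, 139.7], [3.0, 5.0, 6.0]))
```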

Relevance:

100.00%

Publisher:

Abstract:

Many soils have a hard-setting behavior and are also known as cohesive soils ("coesos"). In such soils, the penetration resistance increases markedly when dry and decreases considerably when moist, creating serious limitations for plant emergence and growth. To evaluate the level of structural degradation in hard-setting soils of different texture classes and to create an index for assessing hardness levels in these soils, six representative soil profiles were selected in the field in various regions of Brazil. The following indices were tested: S, which measures soil physical quality, and H, which analyzes the degree of hardness and the effective stress in the soil during drying. Both indices were calculated using previously described functions based on data from the water-retention curves of the soils. The hard-setting soils identified in different areas of the Brazilian Coastal Tablelands have distinct compaction (hardness) levels, which can be satisfactorily measured by the H index. The S index was adequate for evaluating the structural characteristics of the hard-setting soils, classifying them as suitable or poor for cultivation, but only when the soil moisture level was near the inflection point. The H index showed that increases in density in hard-setting soils result from increases in effective stress and not from the soil texture. Bulk density values (Bd) > 1.48 kg dm-3 classify the soil as hard-setting, and its structural organization is considered "poor".
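
The S index is commonly taken as the slope of the water-retention curve at its inflection point when water content is plotted against the logarithm of tension. The sketch below evaluates that slope numerically for an assumed van Genuchten curve; the parameter values and the purely numerical treatment are illustrative and may differ from the exact formulation used in the study.

```python
import numpy as np

def s_index(theta_r, theta_s, alpha, n, h=np.logspace(-1, 5, 2000)):
    """Numerical slope |d(theta)/d(ln h)| at the inflection point of a van
    Genuchten retention curve.  Sketch only: parameter fitting and the exact
    index definition used in the study are not reproduced here."""
    m = 1.0 - 1.0 / n
    theta = theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m
    slope = np.gradient(theta, np.log(h))   # d(theta)/d(ln h) along the curve
    return float(np.max(np.abs(slope)))     # steepest slope = inflection point

# Hypothetical van Genuchten parameters (theta in m3/m3, h in hPa)
print(f"S of roughly {s_index(theta_r=0.10, theta_s=0.45, alpha=0.02, n=1.4):.3f}")
```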

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Aneurysmal subarachnoid haemorrhage (aSAH) is a haemorrhagic form of stroke that occurs in a younger population than ischaemic stroke or intracerebral haemorrhage. It accounts for a large proportion of productive life-years lost to stroke. Its surgical and medical treatment represents a multidisciplinary effort. Because of the complexity of the disease, its management remains difficult to standardise, and quality of care is accordingly difficult to assess. OBJECTIVE: To create a registry to assess management parameters of patients treated for aSAH in Switzerland. METHODS: A cohort study was initiated with the aim of recording the characteristics of patients admitted with aSAH, starting January 1st, 2009. Ethics committee approval was obtained from, or is pending at, the institutional review boards of all centres. During the study period, seven Swiss hospitals (five university [U] and two non-university medical centres), each with a neurosurgery department, an intensive care unit and an interventional neuroradiology team, have so far agreed to participate in the registry (Aarau, Basel [U], Bern [U], Geneva [U], Lausanne [U], St. Gallen, Zürich [U]). Demographic and clinical parameters are entered into a common database. DISCUSSION: This database will soon provide (1) a nationwide assessment of the current standard of care and (2) the outcomes of patients suffering from aSAH in Switzerland. Based on data from this registry, we can conduct cohort comparisons or design diagnostic or therapeutic studies at a national level. Moreover, a standardised registration system will allow healthcare providers to assess the quality of care.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents multiple kernel learning (MKL) regression as an exploratory spatial data analysis and modelling tool. The MKL approach is introduced as an extension of support vector regression, where MKL uses dedicated kernels to divide a given task into sub-problems and to treat them separately in an effective way. It provides better interpretability for non-linear robust kernel regression at the cost of a more complex numerical optimization. In particular, we investigate the use of MKL as a tool that allows us to avoid using ad hoc topographic indices as covariates in statistical models in complex terrain. Instead, MKL learns these relationships from the data in a non-parametric fashion. A study on data simulated from real terrain features confirms the ability of MKL to enhance the interpretability of data-driven models and to aid feature selection without degrading predictive performance. Here we examine the stability of the MKL algorithm with respect to the number of training data samples and to the presence of noise. The results of a real case study are also presented, where MKL is able to exploit a large set of terrain features computed at multiple spatial scales when predicting mean wind speed in an Alpine region.
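
MKL builds the regression kernel as a weighted combination of several dedicated kernels, one per group of covariates, and a full MKL solver learns the weights together with the regression. scikit-learn has no built-in MKL, so the sketch below fixes the kernel weights by hand and feeds the combined Gram matrix to support vector regression with a precomputed kernel; the feature groups, weights and kernel parameters are assumptions for illustration only.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics.pairwise import rbf_kernel

# Hypothetical terrain features split into two groups (e.g. slope- vs. curvature-related)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = np.sin(X[:, 0]) + 0.5 * X[:, 3] + 0.1 * rng.normal(size=200)
groups = [slice(0, 3), slice(3, 6)]           # one kernel per feature group

def combined_kernel(A, B, weights, gammas):
    """Convex combination of RBF kernels, one per feature group.  The weights
    are fixed here; a genuine MKL algorithm would learn them from the data."""
    return sum(w * rbf_kernel(A[:, g], B[:, g], gamma=gm)
               for w, g, gm in zip(weights, groups, gammas))

weights, gammas = [0.7, 0.3], [0.5, 0.5]
K_train = combined_kernel(X, X, weights, gammas)
model = SVR(kernel="precomputed", C=10.0).fit(K_train, y)
X_new = rng.normal(size=(5, 6))
print(model.predict(combined_kernel(X_new, X, weights, gammas)))
```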

Relevance:

100.00%

Publisher:

Abstract:

Ramp metering has been successfully implemented in many states to improve traffic operations on freeways. Studies have documented the positive mobility and safety benefits of ramp metering. However, there have been no studies on the use of ramp metering in work zones. This report documents the results from the first deployment of temporary ramp meters in work zones in the United States. Temporary ramp meters were deployed at seven urban short-term work zones in Missouri. Safety measures such as driver compliance, merging behavior, and speed differentials were extracted from video-based field data. Mobility analysis was conducted using a calibrated simulation model, and total delays were obtained for under-capacity, at-capacity, and over-capacity conditions. This evaluation suggests that temporary ramp meters should only be deployed at work zone locations where there is potential for congestion, and turned on only during above-capacity conditions. The compliance analysis showed that non-compliance could be a major safety issue when temporary ramp meters are deployed in under-capacity conditions. The use of a three-section signal head, instead of the traditional two-section head used for permanent ramp metering, produced significantly higher compliance rates. Ramp metering reduced ramp platooning by increasing the percentage of single-vehicle merges from under 50% to over 70%. The accepted-merge-headway results were not statistically significant, even though a slight shift towards longer headways was found with the use of ramp meters. Mobility analysis revealed that ramp metering produced delay savings for both mainline and ramp vehicles at work zones operating above capacity. On average, ramp metering produced a 24% decrease in total delay (mainline plus ramp) under low truck-percentage conditions and a 19% decrease under high truck-percentage conditions.

Relevance:

100.00%

Publisher:

Abstract:

Web address of the conference presentations: http://www.geoinfo.tuwien.ac.at/events/Euresco2000/gdgis.htm

Relevance:

100.00%

Publisher:

Abstract:

This article describes the application of a recently developed general unknown screening (GUS) strategy based on LC coupled to a hybrid linear ion trap-triple quadrupole mass spectrometer (LC-MS/MS-LIT) for the simultaneous detection and identification of drug metabolites following in vitro incubation with human liver microsomes. The histamine H1 receptor antagonist loratadine was chosen as a model compound to demonstrate the interest of such an approach, because of its previously described complex and extensive metabolism. Detection and mass spectral characterization were based on data-dependent acquisition, switching between a survey scan acquired in the ion-trapping Q3 scan mode with dynamic subtraction of background noise and a dependent scan in the ion-trapping product ion scan mode for automatically selected parent ions. In addition, the MS(3) mode was used in a second step to confirm the structure of a few fragment ions. The sensitivity of the ion-trapping modes combined with the selectivity of the triple quadrupole modes allowed, with only one injection, the detection and identification of 17 phase I metabolites of loratadine. The GUS procedure used in this study may be applicable as a generic technique for the characterization of drug metabolites after in vitro incubation, and probably also in in vivo experiments.