923 results for "classification aided by clustering"


Relevance:

100.00%

Publisher:

Abstract:

X-ray absorption spectroscopy (XAS) is a powerful means of investigating structural and electronic properties in condensed-matter physics. Analysis of the near-edge part of the XAS spectrum, the so-called X-ray Absorption Near Edge Structure (XANES), can typically provide the following information on the photoexcited atom:
- oxidation state and coordination environment;
- speciation of transition metal compounds;
- conduction band DOS projected on the excited atomic species (PDOS).
Analysis of XANES spectra is greatly aided by simulations. In the most common scheme, the multiple-scattering framework is used with the muffin-tin approximation for the scattering potential, and the spectral simulation is based on a hypothetical reference structure. This approach has the advantage of requiring relatively little computing power, but in many cases the assumed structure is quite different from the actual system measured, and the muffin-tin approximation is not adequate for low-symmetry structures or highly directional bonds. It is therefore well justified to develop alternative methods. In one approach, the spectral simulation is based on atomic coordinates obtained from a structure optimized with Density Functional Theory (DFT). In another approach, which is the object of this thesis, the XANES spectrum is calculated directly from an ab initio DFT calculation of the atomic and electronic structure. This method takes full advantage of the true many-electron final-state wavefunction, which can be computed with DFT algorithms that include a core hole on the absorbing atom, to compute the final cross section. To calculate the many-electron final wavefunction, the Projector Augmented Wave (PAW) method is used.
In this scheme, the absorption cross section is written as a function of several contributions, among them the many-electron wavefunction of the final state; the latter is calculated starting from a pseudo-wavefunction and reconstructing the true wavefunction with a transformation operator defined by sets of functions called partial waves and projector functions. The aim of my thesis is to apply and test the PAW methodology for the calculation of the XANES cross section. I have focused on iron and silicon structures and on some target biological molecules (myoglobin and cytochrome c). Other inorganic and biological systems could be considered in future applications of this methodology, which could become an important improvement over the multiple-scattering approach.
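The reconstruction described above has a compact standard form (this is the textbook PAW expression in Blöchl's notation, not necessarily the exact formulation used in the thesis): the true all-electron wavefunction is recovered from the pseudo-wavefunction by a linear transformation built from the partial waves and projectors,

```latex
\lvert \psi \rangle \;=\; \lvert \tilde{\psi} \rangle
  \;+\; \sum_i \left( \lvert \phi_i \rangle - \lvert \tilde{\phi}_i \rangle \right)
  \langle \tilde{p}_i \vert \tilde{\psi} \rangle
```

where the \(\phi_i\) are all-electron partial waves, the \(\tilde{\phi}_i\) the corresponding pseudo partial waves, and the \(\tilde{p}_i\) the projector functions; the XANES cross section then follows from matrix elements of the transition operator between initial and final states reconstructed in this way.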


The purpose of this work is to find a methodology that makes it possible to recycle the fines (0-4 mm) produced in Construction and Demolition Waste (CDW) processing. At the moment this fraction is an unwanted by-product: it has a high contaminant content, and it has to be separated from the coarse fraction because its high water absorption can affect the properties of the concrete. In fact, in some countries the use of fine recycled aggregates is highly restricted or even banned. This work is part of the European project C2CA (from Concrete to Cement and Clean Aggregates) and was carried out at the Faculty of Civil Engineering and Geosciences of the Delft University of Technology, in the Resources and Recycling laboratory. This research proposes procedures to close the loop of the entire recycling process. After classification by ADR (Advanced Dry Recovery), the two fractions "airknife" and "rotor" (which together constitute the 0-4 mm fraction) are fed into a new machine that works at high temperature. The temperatures analysed in this research are 600 °C and 750 °C, because at those temperatures the cement bonds are expected to become very weak. The final goal is to "clean" the coarse fraction (0.250-4 mm) of the cement still attached to the sand and to concentrate the cement paste in the 0-0.250 mm fraction. The new set-up is able to dry the material in a few seconds, divide it into a coarse and a fine fraction by means of air, and increase the amount of fines (0-0.250 mm) by promoting attrition between the particles through a vibration device. The coarse fraction is then processed in a ball mill to improve the result and reach the final goal; thanks to the high temperature, the milling time can be markedly reduced. The 0-2 mm sand, after being heated and milled, is used to replace 100% of the norm sand in mortar production.
The results are very promising: the mortar made with recycled sand develops strength early; the increment with respect to mortar made with norm sand is 20% after three days and 7% after seven days. This research demonstrates that, once the temperature is increased, it is possible to obtain a clean coarse fraction (0.250-4 mm), free of cement paste, with the paste concentrated in the 0-0.250 mm fine fraction. The milling and drying times can be greatly reduced, and the recycled sand shows better mechanical performance than the natural sand.


Purpose: Mismatches between pump output and venous return in a continuous-flow ventricular assist device may elicit episodes of ventricular suction. This research describes a series of in vitro experiments to characterize the operating conditions under which the EVAHEART centrifugal blood pump (Sun Medical Technology Research Corp., Nagano, Japan) can be operated with minimal concern regarding left ventricular (LV) suction. Methods: The pump was interposed into a pneumatically driven pulsatile mock circulatory system (MCS) in the ventricular apex-to-aorta configuration. Under varying conditions of preload, afterload, and systolic pressure, the speed of the pump was increased step-wise until suction was observed. Identification of suction was based on pump inlet pressure. Results: In the case of reduced LV systolic pressure, reduced preload (≤10 mmHg), and reduced afterload (≤60 mmHg), suction was observed for speeds ≥2,200 rpm. However, suction did not occur at any speed (up to a maximum of 2,400 rpm) when preload was kept within 10-14 mmHg and afterload was ≥80 mmHg. Although in vitro experiments cannot replace in vivo models, the results indicated that ventricular suction can be avoided if sufficient preload and afterload are maintained. Conclusion: Conditions of hypovolemia and/or hypotension may increase the risk of suction at the highest speeds, irrespective of the native ventricular systolic pressure. However, in vitro guidelines are not directly transferable to the clinical situation; therefore, patient-specific evaluation is recommended, which can be aided by ultrasonography at various points in the course of support.
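The reported operating envelope amounts to a simple decision rule. The sketch below is purely illustrative: the thresholds are taken from the abstract, but the function name, structure, and the "outside tested envelope" catch-all are my own.

```python
def suction_risk(preload_mmhg, afterload_mmhg, speed_rpm):
    """Illustrative decision rule from the reported in vitro envelope.

    Suction was observed at speeds >= 2,200 rpm when preload <= 10 mmHg
    and afterload <= 60 mmHg; it was not observed up to 2,400 rpm when
    preload stayed within 10-14 mmHg and afterload was >= 80 mmHg.
    """
    if preload_mmhg <= 10 and afterload_mmhg <= 60 and speed_rpm >= 2200:
        return "suction observed"
    if 10 <= preload_mmhg <= 14 and afterload_mmhg >= 80 and speed_rpm <= 2400:
        return "no suction observed"
    return "outside tested envelope"
```

Conditions between the two tested regimes fall into the last branch, mirroring the paper's caution that the in vitro envelope does not transfer directly to patients.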


Echicetin, a heterodimeric snake C-type lectin from Echis carinatus, is known to bind specifically to platelet glycoprotein (GP)Ib. We now show that, in addition, it agglutinates platelets in plasma and induces platelet signal transduction. The agglutination is caused by binding to a specific protein in plasma. The protein was isolated from plasma and shown to cause platelet agglutination when added to washed platelets in the presence of echicetin. It was identified as immunoglobulin Mκ (IgMκ) by peptide sequencing and dot blotting with specific heavy and light chain anti-immunoglobulin reagents. Platelet agglutination by clustering echicetin with IgMκ induced P-selectin expression and activation of GPIIb/IIIa as well as tyrosine phosphorylation of several signal transduction molecules, including p53/56(LYN), p64, p72(SYK), p70 to p90, and p120. However, neither ethylenediaminetetraacetic acid nor specific inhibition of GPIIb/IIIa affected platelet agglutination or activation by echicetin. Platelet agglutination and induction of signal transduction could also be produced by cross-linking biotinylated echicetin with avidin. These data indicate that clustering of GPIb alone is sufficient to activate platelets. In vivo, echicetin probably activates platelets rather than inhibits platelet activation, as previously proposed, accounting for the observed induction of thrombocytopenia.


As the performance gap between microprocessors and memory continues to grow, main memory accesses incur long latencies and become a factor limiting system performance. Previous studies show that main memory access streams contain significant locality and that SDRAM devices provide parallelism through multiple banks and channels. This locality and parallelism have not been exploited thoroughly by conventional memory controllers. In this thesis, SDRAM address mapping techniques and memory access reordering mechanisms are studied and applied to memory controller design with the goal of reducing the observed main memory access latency. The proposed bit-reversal address mapping attempts to distribute main memory accesses evenly across the SDRAM address space to enable bank parallelism. As memory accesses to distinct banks are interleaved, the access latencies are partially hidden and therefore reduced. By taking cache conflict misses into consideration, bit-reversal address mapping is able to direct potential row conflicts to different banks, further improving performance. The proposed burst scheduling is a novel access reordering mechanism that creates bursts by clustering accesses directed to the same rows of the same banks. Subject to a threshold, reads are allowed to preempt writes, and qualified writes are piggybacked at the end of the bursts. A sophisticated access scheduler selects accesses based on priorities and interleaves accesses to maximize SDRAM data bus utilization. Consequently, burst scheduling reduces the row conflict rate, increasing and exploiting the available row locality. Using revised SimpleScalar and M5 simulators, both techniques are evaluated and compared with existing academic and industrial solutions. With the SPEC CPU2000 benchmarks, bit-reversal reduces execution time by 14% on average over traditional page-interleaving address mapping.
Burst scheduling achieves a 15% reduction in execution time over conventional in-order bank scheduling. Working constructively together, bit-reversal and burst scheduling achieve a 19% speedup across the simulated benchmarks.
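The intuition behind bit-reversal mapping can be shown in a few lines of Python. This is an illustrative toy with tiny field widths, not the thesis's actual bit layout: accesses that conflict in the cache differ in their high-order address bits, and reversing the bits of the combined row/bank field turns those high-order differences into different bank indices.

```python
COL_BITS, BANK_BITS, ROW_BITS = 2, 2, 4  # toy field widths for illustration

def reverse_bits(value, width):
    """Reverse the low `width` bits of `value`."""
    result = 0
    for _ in range(width):
        result = (result << 1) | (value & 1)
        value >>= 1
    return result

def map_page_interleave(addr):
    """Conventional page interleaving: {row, bank, column} taken straight
    from contiguous address fields."""
    col = addr & ((1 << COL_BITS) - 1)
    bank = (addr >> COL_BITS) & ((1 << BANK_BITS) - 1)
    row = addr >> (COL_BITS + BANK_BITS)
    return row, bank, col

def map_bit_reversal(addr):
    """Bit-reversal mapping: reverse the {row, bank} field so that the
    high-order address bits (the ones that differ between cache-conflicting
    accesses) end up selecting the bank, spreading row conflicts across banks."""
    col = addr & ((1 << COL_BITS) - 1)
    upper = reverse_bits(addr >> COL_BITS, BANK_BITS + ROW_BITS)
    bank = upper & ((1 << BANK_BITS) - 1)
    row = upper >> BANK_BITS
    return row, bank, col
```

With these toy widths, addresses 0 and 128 differ only in a high-order bit: page interleaving sends both to the same bank (a row conflict), while bit reversal sends them to different banks.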


New volumetric and mass flux estimates have been calculated for the Kenya Rift. Spatial and temporal histories for volcanic eruptions, lacustrine deposition, and hominin fossil sites are presented, aided by the compilation of a new digital geologic map. The distribution of volcanism over time indicates several periods of southward expansion followed by relative positional stasis. Volcanism occurs throughout the activated rift length, with no obvious abandonment as the rift system migrated. The main exception is a period of volcanic concentration around 10 Ma, when activity was confined within 2° of the equator. Volumes derived from seismic data indicate a total volume of c. 310,000 km³ (2.47 × 10¹⁰ kg/yr), which is significantly more than the map-derived volumes found here or published previously. Map-based estimates are likely affected by a bias against recognizing small-volume events in the older record. Such events are, however, the main driver of erupted volume over the last 5 Ma. A technique developed here to counter this bias results in convergence of the two volume estimation techniques. The relative erupted composition over time is variable. Overall, the erupted material has a mafic to silicic ratio of 0.9:1. Basalts are distinctly more common in the Turkana region, which previously experienced Mesozoic rifting. Despite the near-equal ratio of mafic to silicic products, the Kenya Rift otherwise fits the definition of a SLIP (silicic large igneous province). It is proposed that the compositions would better fit the published definition if the Turkana region were not twice-rifted. Lacustrine sedimentation post-dates initial volcanism by about 5 million years and follows the same volcanic trends, showing south- and eastward migration over time. This sedimentation delay is likely related to the timing of fault displacements.
Evidence of hominin habitation is distinctly abundant in the northern and southern sections of the Kenya Rift, but there is an observed gap in the equatorial rift between 4 and 0.5 million years ago. After 0.5 Ma, sites appear to progress towards the equator. The pattern and timing of hominin site distributions suggest that the equatorial gap in habitation may be the result of active volcanic avoidance.
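The quoted mass flux follows from the erupted volume once a rock density and an eruptive duration are assumed; neither is given in the abstract, so the values below are my own assumptions, chosen only to show that the arithmetic lands in the right range.

```python
# Back-of-envelope check of the quoted eruptive mass flux.
volume_km3 = 310_000          # total erupted volume from seismic data (abstract)
rho_kg_m3 = 2900.0            # assumed mean rock density (assumption, not from abstract)
duration_yr = 36e6            # assumed duration of rift volcanism (assumption)

volume_m3 = volume_km3 * 1e9  # 1 km^3 = 1e9 m^3
mass_flux = volume_m3 * rho_kg_m3 / duration_yr
print(f"{mass_flux:.2e} kg/yr")  # on the order of the quoted 2.47e10 kg/yr
```

Varying the assumed density or duration within plausible bounds moves the result by tens of percent, which is why the abstract's precise figure should be read against its stated inputs rather than this sketch.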


The VirB/D4 type IV secretion system (T4SS) of Agrobacterium tumefaciens functions to transfer substrates to infected plant cells through assembly of a translocation channel and a surface structure termed a T-pilus. This thesis focuses on identifying the contributions of VirB10 to substrate transfer and T-pilus formation through a mutational analysis. VirB10 is a bitopic protein with several domains: (i) a cytoplasmic N-terminus, (ii) a single transmembrane (TM) α-helix, (iii) a proline-rich region (PRR), and (iv) a large C-terminal modified β-barrel. I introduced cysteine insertion and substitution mutations throughout the length of VirB10 in order to: (i) test a predicted transmembrane topology, (ii) identify residues/domains contributing to VirB10 stability, oligomerization, and function, and (iii) monitor structural changes accompanying energy activation or substrate translocation. These studies were aided by the recent structural resolution of a periplasmic domain of a VirB10 homolog and of a 'core' complex composed of homologs of VirB10 and two outer-membrane-associated subunits, VirB7 and VirB9. By use of the substituted cysteine accessibility method (SCAM), I confirmed the bitopic topology of VirB10. Through phenotypic studies of Ala-Cys insertion mutations, I identified "uncoupling" mutations in the TM and β-barrel domains that blocked T-pilus assembly but permitted substrate transfer. I showed that cysteine replacements in the C-terminal periplasmic domain yielded a variety of phenotypes with respect to protein accumulation, oligomerization, substrate transfer, and T-pilus formation. By SCAM, I also gained further evidence that VirB10 adopts different structural states during machine biogenesis. Finally, I showed that VirB10 supports substrate transfer even when its TM domain is extensively mutagenized or substituted with heterologous TM domains. By contrast, specific residues, most probably involved in oligomerization of the TM domain, are required for biogenesis of the T-pilus.


Transapical transcatheter aortic valve implantation (TA-TAVI) is the recognized first-choice surgical TAVI access. Expansion of this well-established treatment modality, with subsequent broader patient inclusion, has accelerated the development of second-generation TA-TAVI devices. The Swiss ACURATE TA valve (Symetis) allows excellent anatomical positioning, resulting in a very low incidence of paravalvular leaks. The self-expanding stent features an hourglass shape to wedge the native aortic valve annulus, and a specially designed delivery system facilitates controlled release aided by tactile operator feedback. The ACURATE TA valve, made of three native porcine non-coronary leaflets, received CE approval in September 2011. Since then, this valve has become the third most frequently implanted TAVI device, with over 1,200 implants in Europe and South America. Results from the Symetis ACURATE TA™ Valve Implantation ('SAVI') registry showed a procedural success rate of 98.0% and a survival rate of 93.2% at 30 days. This presentation provides technical considerations and detailed procedural aspects of device implantation.


Lung cancer is a devastating disease with very poor prognosis. The design of better treatments for patients would be greatly aided by mouse models that closely resemble the human disease. The most common type of human lung cancer is adenocarcinoma, with frequent metastasis. Unfortunately, current models for this tumor are inadequate due to the absence of metastasis. Based on the molecular findings in human lung cancer and the metastatic potential of osteosarcomas in mutant p53 mouse models, I hypothesized that mice with both K-ras and p53 missense mutations might develop metastatic lung adenocarcinomas. Therefore, I incorporated both K-rasLA1 and p53R172HΔg alleles into mouse lung cells to establish a more faithful model of human lung adenocarcinoma for translational and mechanistic studies. Mice with both mutations (K-rasLA1/+ p53R172HΔg/+) developed advanced lung adenocarcinomas with histopathology similar to human tumors. These lung adenocarcinomas were highly aggressive and metastasized to multiple intrathoracic and extrathoracic sites in a pattern similar to that seen in lung cancer patients. This mouse model also showed gender differences in cancer-related death, and 23.2% of the mice developed pleural mesotheliomas. In a preclinical study, the drug erlotinib (Tarceva) decreased the number and size of lung lesions in this model. These data demonstrate that this mouse model closely mimics human metastatic lung adenocarcinoma and provides an invaluable system for translational studies. To screen for genes important for metastasis, gene expression profiles of primary lung adenocarcinomas and metastases were analyzed. Microarray data showed that these two groups segregated in gene expression and had 79 highly differentially expressed genes (more than 2.5-fold change and p < 0.001). The microarray data for Bub1b, Vimentin, and CCAM1 were validated in tumors by quantitative real-time PCR (QPCR).
Bub1b, a mitotic checkpoint gene, was overexpressed in metastases, and this correlated with more chromosomal abnormalities in metastatic cells. Vimentin, a marker of epithelial-mesenchymal transition (EMT), was also highly expressed in metastases. Interestingly, Twist, a key EMT inducer, was also highly upregulated in metastases by QPCR, and this significantly correlated with the overexpression of Vimentin in the same tumors. These data suggest that EMT occurs in lung adenocarcinomas and is a key mechanism for the development of metastasis in K-rasLA1/+ p53R172HΔg/+ mice. Thus, this mouse model provides a unique system to further probe the molecular basis of metastatic lung cancer.
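The gene-selection criterion described above (fold change greater than 2.5 in either direction, p < 0.001) is a standard two-threshold filter. A minimal sketch follows; the gene names are taken from the abstract, but the fold changes and p-values are invented for illustration.

```python
def select_differential(genes, fold_threshold=2.5, p_threshold=0.001):
    """Keep genes whose fold change exceeds the threshold in either
    direction (up- or down-regulated) and whose p-value is significant."""
    selected = []
    for name, fold, p in genes:
        changed = fold >= fold_threshold or fold <= 1.0 / fold_threshold
        if changed and p < p_threshold:
            selected.append(name)
    return selected

# Illustrative (invented) values: (gene, fold change metastasis/primary, p-value)
example = [
    ("Bub1b", 3.1, 0.0004),     # up in metastases, significant -> kept
    ("Vimentin", 2.8, 0.0002),  # up in metastases, significant -> kept
    ("GeneX", 1.4, 0.0001),     # change too small -> dropped
    ("GeneY", 4.0, 0.02),       # not significant -> dropped
    ("GeneZ", 0.3, 0.0005),     # strongly down-regulated -> kept
]
```

Applying `select_differential(example)` keeps `Bub1b`, `Vimentin`, and `GeneZ`, illustrating that the filter is symmetric between over- and under-expression.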


The visual cortex of macaque monkeys consists of a large number of cortical areas that span the occipital, parietal, temporal, and frontal lobes and occupy more than half of the cortical surface. Although considerable progress has been made in understanding the contributions of many occipital areas to visual perceptual processing, much less is known concerning the specific functional contributions of higher areas in the temporal and frontal lobes. Previous behavioral and electrophysiological investigations have demonstrated that the inferotemporal cortex (IT) is essential to the animal's ability to recognize and remember visual objects. While it is generally recognized that IT consists of a number of anatomically and functionally distinct visual-processing areas, considerable controversy remains concerning the precise number, size, and location of these areas. Therefore, the precise delineation of the cortical subdivisions of inferotemporal cortex is critical for any significant progress in understanding the specific contributions of inferotemporal areas to visual processing. In this study, anterograde and/or retrograde neuroanatomical tracers were injected into two visual areas in the ventral posterior and central portions of IT (areas PITv and CITvp) to elucidate the corticocortical connections of these areas with well-known areas of occipital cortex and with less well understood regions of inferotemporal cortex. The locations of injection sites and the delineation of the borders of many occipital areas were aided by the pattern of interhemispheric connections revealed following callosal transection and subsequent labeling with HRP. The resulting patterns of connections were represented on two-dimensional computational (CARET) and manual cortical maps, and the laminar characteristics and density of the projection fields were quantified.
The laminar and density features of these corticocortical connections demonstrate thirteen anatomically distinct subdivisions or areas distributed within the superior temporal sulcus and across the inferotemporal gyrus. These results serve to refine previous descriptions of inferotemporal areas, validate recently identified areas, and provide a new description of the hierarchical relationships among occipitotemporal cortical areas in macaques.


The first manuscript, entitled "Time-Series Analysis as Input for Clinical Predictive Modeling: Modeling Cardiac Arrest in a Pediatric ICU" lays out the theoretical background for the project. There are several core concepts presented in this paper. First, traditional multivariate models (where each variable is represented by only one value) provide single point-in-time snapshots of patient status: they are incapable of characterizing deterioration. Since deterioration is consistently identified as a precursor to cardiac arrests, we maintain that the traditional multivariate paradigm is insufficient for predicting arrests. We identify time series analysis as a method capable of characterizing deterioration in an objective, mathematical fashion, and describe how to build a general foundation for predictive modeling using time series analysis results as latent variables. Building a solid foundation for any given modeling task involves addressing a number of issues during the design phase. These include selecting the proper candidate features on which to base the model, and selecting the most appropriate tool to measure them. We also identified several unique design issues that are introduced when time series data elements are added to the set of candidate features. One such issue is in defining the duration and resolution of time series elements required to sufficiently characterize the time series phenomena being considered as candidate features for the predictive model. Once the duration and resolution are established, there must also be explicit mathematical or statistical operations that produce the time series analysis result to be used as a latent candidate feature. In synthesizing the comprehensive framework for building a predictive model based on time series data elements, we identified at least four classes of data that can be used in the model design. 
The first two classes are shared with traditional multivariate models: multivariate data and clinical latent features. Multivariate data is represented by the standard one-value-per-variable paradigm and is widely employed in a host of clinical models and tools; these values are often represented by a number in a given cell of a table. Clinical latent features are derived, rather than directly measured, data elements that more accurately represent a particular clinical phenomenon than any of the directly measured data elements in isolation. The other two classes are unique to the time series data elements. The first of these is the raw data elements. These are represented by multiple values per variable and constitute the measured observations that are typically available to end users when they review time series data; they are often represented as dots on a graph. The final class of data results from performing time series analysis. This class of data represents the fundamental concept on which our hypothesis is based. The specific statistical or mathematical operations are up to the modeler to determine, but we generally recommend that a variety of analyses be performed in order to maximize the likelihood of producing a representation of the time series data elements that is able to distinguish between two or more classes of outcomes. The second manuscript, entitled "Building Clinical Prediction Models Using Time Series Data: Modeling Cardiac Arrest in a Pediatric ICU", provides a detailed description, start to finish, of the methods required to prepare the data and to build and validate a predictive model that uses the time series data elements determined in the first paper. One of the fundamental tenets of the second paper is that manual implementations of time-series-based models are infeasible due to the relatively large number of data elements and the complexity of the preprocessing that must occur before data can be presented to the model.
Each of the seventeen steps is analyzed from the perspective of how it may be automated, when necessary. We identify the general objectives and available strategies of each of the steps, and we present our rationale for choosing a specific strategy for each step in the case of predicting cardiac arrest in a pediatric intensive care unit. Another issue brought to light by the second paper is that the individual steps required to use time series data for predictive modeling are more numerous and more complex than those used for modeling with traditional multivariate data. Even after complexities attributable to the design phase (addressed in our first paper) have been accounted for, the management and manipulation of the time series elements (the preprocessing steps in particular) are issues that are not present in a traditional multivariate modeling paradigm. In our methods, we present the issues that arise from the time series data elements: defining a reference time; imputing and reducing time series data in order to conform to a predefined structure that was specified during the design phase; and normalizing variable families rather than individual variable instances. The final manuscript, entitled: "Using Time-Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit" presents the results that were obtained by applying the theoretical construct and its associated methods (detailed in the first two papers) to the case of cardiac arrest prediction in a pediatric intensive care unit. Our results showed that utilizing the trend analysis from the time series data elements reduced the number of classification errors by 73%. The area under the Receiver Operating Characteristic curve increased from a baseline of 87% to 98% by including the trend analysis. 
In addition to the performance measures, we were also able to demonstrate that adding raw time series data elements without their associated trend analyses improved classification accuracy as compared to the baseline multivariate model, but diminished classification accuracy as compared to when just the trend analysis features were added (i.e., without adding the raw time series data elements). We believe this phenomenon was largely attributable to overfitting, which is known to increase as the ratio of candidate features to class examples rises. Furthermore, although we employed several feature reduction strategies to counteract the overfitting problem, they failed to improve the performance beyond that which was achieved by exclusion of the raw time series elements. Finally, our data demonstrated that pulse oximetry and systolic blood pressure readings tend to start diminishing about 10-20 minutes before an arrest, whereas heart rates tend to diminish rapidly less than 5 minutes before an arrest.
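The abstract does not specify which trend-analysis operations were used; one common choice for turning a time series window into a latent feature is the least-squares slope, sketched below. The variable name and window are illustrative, not taken from the papers.

```python
def slope(values, dt=1.0):
    """Least-squares slope of equally spaced samples: a simple trend
    feature that is negative when the signal is deteriorating."""
    n = len(values)
    xs = [i * dt for i in range(n)]
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# A falling SpO2 trace (hypothetical values) in the minutes before an event
# yields a negative slope, which can be fed to a classifier as a latent
# feature alongside, or instead of, the raw samples.
spo2 = [98, 97, 96, 94, 91, 88]
print(slope(spo2))  # -2.0 (units per sample)
```

Computing such slopes over rolling windows of several vital signs is one concrete way to obtain the "latent candidate features" that the first manuscript argues the traditional snapshot paradigm lacks.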


In the Tractatus de locis dialecticis, Alonso de la Veracruz (1507-1584) promotes a reform of the theory of topics characterized by the combination of some of the most expressive elements of rival and apparently antagonistic traditions, namely humanist dialectic and scholastic logic. To determine whether these elements are articulated in an integrated and coherent way in the theory proposed by Fray Alonso, we assess the compatibility between the scholastic and Agricolan definitions of topic, given that both were equally assimilated by the Tractatus de locis dialecticis. In the study presented here, we argue that Alonso de la Veracruz reconciles these definitions by conceiving of topics from a primarily epistemic perspective. We then examine what led Fray Alonso to praise the innovations of humanist dialectic while, at the same time, insisting on the use of maximal propositions, ignoring the many criticisms directed at them by Rudolph Agricola. We believe this difficulty can be elucidated by the hypothesis that Alonso de la Veracruz opted to retain the maxims because of the important role they play in grounding enthymematic inferences. Finally, we investigate the reasons why the Alonsine taxonomy of topics is not undermined by the presence of traces of both the scholastic and the humanist theories of topics. In this context, we maintain that the classification of topics defended by Fray Alonso is not corrupted by its hybrid character, since it reconciles the most relevant doctrines of the scholastic and humanist traditions.


Territory today is seen as an organized totality that cannot be thought of by separating each of the elements that compose it; each element is defined by its relation to the others. Thus, a mode of thought that integrates different disciplines and kinds of knowledge begins to engage a reality that is far from establishing immovable certainties, and begins to glimpse strategic horizons. Adapting to the non-linearity of the relations that play out over the territory, and to the different speeds at which the various actors operate, demands that flexibility become an essential characteristic of the strategic planning methodology. The multi-causality of the phenomena that structure the territory obliges us to construct qualitative criteria, understanding that it is impossible for us to measure these causal chains and reconstruct them completely in time, while still building a deep framework for action and transformation that responds to a true and verifiable reality. Phenomena produced over the territory never act in isolation, which implies a responsibility to understand the synergies and constraints that affect the outcomes of the processes they set in motion. This paper corresponds to the second phase of the strategic project-identification process of the Plan Estratégico Territorial (PET), begun in 2005. The Plan is carried out by the Subsecretaría de Planificación Territorial of the Ministerio de Planificación Federal and was approached on the basis of three aims: to institutionalize the exercise of strategic thinking, to strengthen a transdisciplinary and multisectoral working methodology, and to design a system for weighting strategic infrastructure projects, at both the provincial and national levels, with a strong qualitative basis.
This process resulted in a weighted portfolio of infrastructure projects, together with a methodology that made it possible to consolidate the provincial planning teams, both in their relationship with political decision-makers and with actors from the many sectors of government, and, through these results, to consolidate and reinforce a culture of strategic thinking about the territory.


This paper reflects on the role of public libraries in educating users about cyberviolence. This phenomenon is expressed through Information and Communication Technologies (ICT), which are used to cause harm to victims. The work is based on three cases of cyberviolence: a murder, an extortion, and a case of pedophilia. The cases help to reflect on the need for intervention activities by public libraries. The theoretical framework supporting the analysis draws on contributions from Information Science, Sociology, and Anthropology. The results show the need for public libraries to become involved in combating cyberviolence, because they function as centers that attract users to access information on the Internet.


This paper analyzes the Revista de Psicología, a journal published irregularly by the Departamento de Psicología of the Universidad Nacional de La Plata between 1964 and 1983. First, it examines the context in which the journal emerged, within the organization of university degree programs in psychology. Second, it presents a bibliometric study analyzing the topics addressed by the journal, the productivity of its authors, and its references. The paper examines the topics covered and focuses on the characteristics of the most productive authors over the journal's nearly twenty years of existence. With respect to the references, three different variables were considered: the authors referenced, the language of the references, and their mean age. According to the classification of psychological literature proposed by Montero and León (2002, 2005), this is a retrospective ex post facto study. The study population comprised the nine issues of the journal, and the data analysis relied on quantitative bibliometric data, although qualitative interpretations are also made.
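Of the three reference variables described, the mean age of references is the only computed one; it is simply the average of (issue year minus reference year). A minimal sketch, with invented data (the actual issue years and references of the Revista are not given in the abstract):

```python
def mean_reference_age(issue_year, reference_years):
    """Mean age of references: average of (issue year - reference year),
    a standard bibliometric indicator of how current the cited literature is."""
    ages = [issue_year - y for y in reference_years]
    return sum(ages) / len(ages)

# Hypothetical issue from 1970 citing four works from 1950-1968:
refs = [1950, 1960, 1965, 1968]
print(mean_reference_age(1970, refs))  # 9.25 years
```

Computed per issue, this indicator lets the kind of study described here track whether a journal's cited literature grew more or less current over its run.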