932 results for GIS BASED PLANNING TOOLS
Abstract:
Integrated master's dissertation in Industrial Engineering
Abstract:
Problem identification and characterization. One of the most important problems associated with software construction is its correctness. In the quest to provide guarantees of correct software behaviour, a variety of development techniques with solid mathematical and logical foundations, known as formal methods, have emerged. Owing to their nature, applying formal methods requires considerable experience and expertise, particularly in mathematics and logic, which makes their application costly in practice. As a consequence, their main application has been limited to critical systems, that is, systems whose malfunction can cause major damage, even though the benefits these techniques provide are relevant to every kind of software. Transferring the benefits of formal methods to software development contexts broader than critical systems would have a high impact on productivity in those contexts. Hypothesis. Having automated analysis tools is an element of great importance. Examples include several powerful analysis tools based on formal methods whose application targets source code directly. In the vast majority of these tools, the gap between the notions developers are used to and those required to apply these formal analysis tools remains too wide. Many tools use assertion languages outside the usual knowledge and habits of developers. Moreover, in many cases the output produced by the analysis tool requires some command of the underlying formal method. This problem can be alleviated by producing suitable tools. Another problem intrinsic to automated analysis techniques is how they behave as the size and complexity of the artefacts to be analysed grow (scalability). This limitation is widely known and is considered critical for the applicability of formal analysis methods in practice. One way to attack this problem is to exploit information and characteristics of specific application domains. Objectives. This project aims at building formal analysis tools that contribute to the quality, in terms of functional correctness, of specifications, models or code in the context of software development. More precisely, it seeks, on the one hand, to identify specific settings in which certain automated analysis techniques, such as analysis based on SMT or SAT solving, or model checking, can be taken to levels of scalability beyond those known for these techniques in general settings. We will attempt to implement the adaptations of the chosen techniques in tools that can be used by developers familiar with the application context but not necessarily knowledgeable about the underlying methods or techniques. Materials and methods. The materials will be bibliography relevant to the area and computing equipment. Methods: the methods of discrete mathematics, logic and software engineering will be employed. Expected results. One of the expected results of the project is the identification of specific application domains for formal analysis methods.
As a result of the project, analysis tools are expected to emerge whose level of usability is adequate for application by developers without specific training in the formal methods used. Importance of the project. The main impact of this project will be its contribution to the practical application of formal analysis techniques at different stages of software development, with the aim of increasing software quality and reliability. A crucial factor for software quality is correctness. Traditionally, formal approaches to software development concentrate on functional correctness and tackle this problem by relying on well-defined notations founded on solid mathematical grounds. This makes formal methods better suited for analysis, due to their precise semantics, but they are usually more complex and require familiarity and experience with the manipulation of mathematical definitions. As a consequence, their acceptance by software engineers is rather restricted, and applications of formal methods have been confined to critical systems. Nevertheless, the advantages that formal methods provide apply to any kind of software system. It is widely accepted that appropriate tool support for formal analysis is essential if one seeks to support software development based on formal methods. Indeed, some of the relatively recent successes of formal methods are accompanied by good-quality tools that automate powerful analysis mechanisms and are even integrated into widely used development environments. Still, most of these tools concentrate on code analysis, and in many cases they are still far from being simple enough to be employed by software engineers without experience in formal methods. Another important obstacle to the adoption of tool support for formal methods is scalability. Automated software analysis is intrinsically complex, and thus techniques do not scale well in the general case. In this project, we will attempt to identify particular modelling, design, specification or coding activities in software development processes where automated formal analysis techniques can be applied. By focusing on very specific application domains, we expect to find characteristics that can be exploited to increase the scalability of the corresponding analyses, compared to the general case.
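As a purely illustrative sketch of the kind of SMT-based code analysis mentioned in this abstract (not a tool from this project), the following Python snippet uses the Z3 solver to check whether a simple assertion over a code fragment can be violated; the program fragment and assertion are invented for the example.

    # Minimal sketch of SMT-based assertion checking with Z3 (z3-solver package).
    # The program fragment and assertion are illustrative, not from the project.
    from z3 import Int, Solver, And, Not, sat

    x = Int('x')
    y = Int('y')

    # Path condition of a hypothetical fragment: 0 <= x < 10 and y == x + 1.
    path_condition = And(x >= 0, x < 10, y == x + 1)

    # Assertion the developer expects to hold: y <= 10.
    assertion = y <= 10

    s = Solver()
    s.add(path_condition, Not(assertion))  # search for a counterexample
    if s.check() == sat:
        print("assertion can fail, counterexample:", s.model())
    else:
        print("assertion holds for all inputs satisfying the path condition")

The developer-facing tools envisaged in the project would hide exactly this kind of encoding and solver interaction behind notations the developer already knows.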
Abstract:
Over the last decades, the development and use of Geographic Information Systems (GIS) and Satellite Positioning Systems (GPS) have been promoted with the aim of improving the productive efficiency of different extensive cropping systems in agronomic, economic and environmental terms. These new technologies make it possible to measure the spatial variability of site properties, such as apparent electrical conductivity and other terrain attributes, as well as their effect on the spatial distribution of yields. Site-specific management can then be applied within fields to improve the efficiency of agrochemical input use, the protection of the environment and the sustainability of rural life. Currently, there is a wide supply of precision agriculture technologies for capturing spatial variation across sites within a field. The optimal use of the large volume of data generated by precision agriculture machinery strongly depends on the ability to explore the information on the complex interactions underlying productive outcomes. The spatial covariation of site properties and crop yield has been studied with classical geostatistical models based on the theory of regionalized variables. New developments in contemporary statistical modelling, notably linear mixed models, constitute promising tools for handling spatially correlated data. Moreover, given the multivariate nature of the many variables recorded at each site, multivariate analysis techniques could provide valuable information for visualizing and exploiting georeferenced data. Understanding the agronomic basis of the complex interactions that occur at the scale of production fields is now possible with the use of these new technologies. The objectives of this project are: (i) to develop methodological strategies based on the complementary use of multivariate and geostatistical techniques for the classification of within-field sites and the study of interdependencies between site variables and yield; (ii) to propose alternative mixed models, based on spatial correlation functions for the error terms, that allow exploring spatial correlation patterns of within-field yields and soil properties in the delimited sites. Over the last decades, the use and development of Geographical Information Systems (GIS) and Satellite Positioning Systems (GPS) have been strongly promoted in cropping systems. Such technologies allow measuring the spatial variability of site properties, including electrical conductivity and other soil features, as well as their impact on the spatial variability of yields. Therefore, site-specific management can be applied to improve the efficiency of agrochemical use, environmental protection and the sustainability of rural life. Currently, there is a wide offer of technological resources to capture spatial variation across sites within a field. However, the optimal use of data coming from precision agriculture machinery strongly depends on the capability to explore the information about the complex interactions underlying productive outputs.
The covariation between spatial soil properties and yields from georeferenced data has been treated graphically or with standard geostatistical approaches. New statistical modelling capabilities from the linear mixed model framework are promising for dealing with correlated data such as those produced by precision agriculture. Moreover, capitalizing on the multivariate nature of the multiple data collected at each site, several multivariate statistical approaches could be crucial tools for the analysis of georeferenced data. Understanding the basis of the complex interactions at the scale of production fields is now within reach with the use of these new techniques. Our main objectives are: (1) to develop new statistical strategies, based on the complementarity of geostatistics and multivariate methods, useful for classifying sites within fields grown with grain crops and for analysing the interrelationships of several soil and yield variables; (2) to propose linear mixed models to predict yield according to spatial soil variability and to build contour maps that promote a more sustainable agriculture.
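As an illustration of the geostatistical ingredient of such an approach (not code from the project), the sketch below computes an empirical semivariogram from georeferenced yield data and fits an exponential correlation model; the coordinates and yields are synthetic placeholders.

    # Illustrative sketch: empirical semivariogram of georeferenced yield data
    # and an exponential model fit. Synthetic data; not from the project.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 500, size=(300, 2))                       # site coordinates (m)
    yield_t_ha = 5 + 0.004 * coords[:, 0] + rng.normal(0, 0.5, 300)   # yield (t/ha)

    # Pairwise distances and semivariances 0.5*(y_i - y_j)^2.
    h = pdist(coords)
    gamma = 0.5 * pdist(yield_t_ha[:, None], metric="sqeuclidean")

    # Bin the lags and average the semivariances per bin.
    bins = np.linspace(0, 250, 26)
    idx = np.digitize(h, bins)
    lag = np.array([h[idx == i].mean() for i in range(1, len(bins)) if np.any(idx == i)])
    semiv = np.array([gamma[idx == i].mean() for i in range(1, len(bins)) if np.any(idx == i)])

    # Exponential semivariogram model: nugget + sill * (1 - exp(-h / range)).
    def exp_model(h, nugget, sill, range_par):
        return nugget + sill * (1.0 - np.exp(-h / range_par))

    params, _ = curve_fit(exp_model, lag, semiv, p0=[0.1, 0.5, 50.0])
    print("nugget, partial sill, range:", params)

A fitted correlation structure of this kind is what would enter the error term of the proposed linear mixed models.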
Abstract:
Sustainable development requires appropriate and continuous planning and management of economic, socio-cultural and environmental resources. Tourism planning calls for continuous collaboration among tourism agencies, local authorities and local communities for the success of the industry. While evidence suggests that tourism planning has been extensively documented, it is apparent that Donegal and Sligo County Councils have, in some cases, failed to adequately address the significance of planning for the tourism industry in the North West of Ireland. This was investigated through interviews with the chief planners of Donegal and Sligo county councils, conducted in conjunction with an analysis of the county development plans formulated by both organisations involved in this study. Evidence suggests that although tourism is extensively documented by Donegal and Sligo county councils, neither of the two local authorities has developed implementation strategies to facilitate the promotion of sustainable tourism development. This research compares and analyses Donegal and Sligo county councils and how they plan for sustainable tourism development. It outlines the role of the county councils in relation to tourism planning and how Donegal and Sligo compare in how they plan for such a significant industry in the North West of Ireland. It highlights the importance of implementation tools and methods and offers future directions that can assist in the development of sustainable tourism.
Abstract:
This research looked at the scientific evidence available on climate change and, in particular, at projections of sea level rise, which ranged from 0.5 m to 2 m by the end of the century. These projections were then considered in an Irish context. A review of current policy in Ireland revealed that there was no dedicated Government policy on climate change or coastal zone management. In terms of spatial planning policy, it became apparent that there was little or no guidance on climate change at national, regional or local level. Therefore, to determine the likely impacts of sea level rise in Ireland under current spatial planning practice and policy, a scenario-building exercise was carried out for two case study areas in Galway Bay: Oranmore, a densely populated town located to the east of Inner Galway Bay, and Tawin Island, a dispersed rural community located to the south east of Inner Galway Bay. 'Best' and 'worst' case scenarios for sea level rise were envisaged for both areas. In the absence of specific climate change policies, it was projected that in the 'best' case scenario of 0.5 m sea level rise, Tawin Island would suffer serious and adverse impacts while Oranmore was likely to experience slight to moderate impacts. However, in the 'worst' case scenario of a 2 m sea level rise, it was likely that Tawin Island would be abandoned, while many houses, businesses and infrastructure built within the floodplain of Oranmore Bay would be inundated and permanently flooded. In this regard, it was the author's opinion that a strategic and integrated climate change policy and adaptation plan, one that recognises the importance of integrated land use and spatial planning in terms of mitigation and adaptation to climate change, is vital for the island of Ireland.
Abstract:
This research studies the phenomena of national and corporate culture. National culture is the culture the members of a country share, and corporate culture is a subculture shared by the members of an organisation (Schein, 1992). The objective of this research is to reveal whether the employees of equivalent Irish and American companies share the same corporate and national culture and to ascertain whether, within each company, there is a link between national culture and corporate culture. This objective is achieved by replicating research conducted by Shing (1997) in Taiwan. Hypotheses and analytical tools developed by Shing are employed in the current study to allow comparison of results between Shing's study and the present one. The methodology called for the measurement and comparison of national and corporate culture in two equivalent companies within the same industry. The two companies involved in this study are both located in Ireland and are of American and Irish origin, respectively. A sample of three hundred was selected and the response rate was 54%. The findings of this research are: (1) the two companies involved had different corporate cultures; (2) they had the same national culture; (3) there was no link between national culture and corporate culture within either company; (4) the findings were not similar to those of Shing (1997). The implication of these findings is that national and corporate culture are separate phenomena; corporate culture is therefore not a response to national culture. Because the results of this research are not reflected in the findings of Shing (1997), they are context specific. The core recommendation for management is that corporate culture should take account of national culture: although employees recognise the espoused values of corporate culture (Schein, 1992), they are at the same time influenced by a much stronger force, their national culture.
Abstract:
Species distribution models (SDMs) are increasingly used to predict environmentally induced range shifts of the habitats of plant and animal species. Consequently, SDMs are valuable tools for scientifically based conservation decisions. The aims of this paper are (1) to identify important drivers of butterfly species persistence or extinction, and (2) to analyse the responses of endangered butterfly species of dry grasslands and wetlands to likely future landscape changes in Switzerland. Future land use was represented by four scenarios describing: (1) ongoing land use changes as observed at the end of the last century; (2) a liberalisation of the agricultural markets; (3) a slightly lowered agricultural production; and (4) a strongly lowered agricultural production. Two model approaches were applied. The first (logistic regression with principal components) explains which environmental variables have a significant impact on species presence (and absence). The second (predictive SDM) is used to project species distributions under current and likely future land uses. The results of the explanatory analyses reveal that four principal components, related to urbanisation, abandonment of open land and intensive agricultural practices, together with two climate parameters, are the primary drivers of species occurrence (decline). The scenario analyses show that lowered agricultural production is likely to favour dry grassland species due to an increase of non-intensively used land, open-canopy forests, and overgrown areas. In the liberalisation scenario dry grassland species show a decrease in abundance due to a strong increase of forested patches. Wetland butterfly species would decrease under all four scenarios as their habitats become overgrown.
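The first model approach described above (logistic regression on principal components) can be sketched, purely as an illustration, with scikit-learn; the environmental predictors and presence/absence data below are synthetic assumptions, not the Swiss butterfly dataset.

    # Sketch: logistic regression on principal components (scikit-learn).
    # Synthetic presence/absence data; not the Swiss butterfly dataset.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 12))          # 12 environmental/land-use variables per site
    y = (X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 1, 500) > 0).astype(int)  # presence/absence

    model = make_pipeline(StandardScaler(), PCA(n_components=4), LogisticRegression())
    model.fit(X, y)

    # Component loadings and their logistic coefficients indicate which
    # environmental gradients drive occurrence.
    print("explained variance:", model.named_steps["pca"].explained_variance_ratio_)
    print("coefficients on components:", model.named_steps["logisticregression"].coef_)

The predictive SDM step would then apply a fitted model of this kind to the environmental layers of each land-use scenario.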
Abstract:
BACKGROUND: Cone-beam computed tomography (CBCT) image-guided radiotherapy (IGRT) systems are widely used tools to verify and correct the target position before each fraction, making it possible to maximize treatment accuracy and precision. In this study, we evaluate automatic three-dimensional intensity-based rigid registration (RR) methods for prostate setup correction using CBCT scans and study the impact of rectal distension on registration quality. METHODS: We retrospectively analyzed 115 CBCT scans of 10 prostate patients. CT-to-CBCT registration was performed using (a) global RR, (b) bony RR, or (c) bony RR refined by a local prostate RR using the CT clinical target volume (CTV) expanded with margins varying from 1 to 20 mm. After propagation of the manual CT contours, automatic CBCT contours were generated. For evaluation, a radiation oncologist manually delineated the CTV on the CBCT scans. The propagated and manual CBCT contours were compared using the Dice similarity coefficient and a measure based on the bidirectional local distance (BLD). We also conducted a blind visual assessment of the quality of the propagated segmentations. Moreover, we automatically quantified rectal distension between the CT and CBCT scans without using the manual CBCT contours, and we investigated its correlation with registration failures. To improve registration quality, the air in the rectum was replaced with soft tissue using a filter. The results with and without filtering were compared. RESULTS: The statistical analysis of the Dice coefficients and the BLD values showed highly significant differences (p < 10^-6) for the 5-mm and 8-mm local RRs versus the global, bony and 1-mm local RRs. The 8-mm local RR provided the best compromise between accuracy and robustness (median Dice of 0.814 and a 97% success rate with filtering of the air in the rectum). We observed that all failures were due to high rectal distension. Moreover, the visual assessment confirmed the superiority of the 8-mm local RR over the bony RR. CONCLUSION: The most successful CT-to-CBCT RR method proved to be the 8-mm local RR. We have shown the correlation between its registration failures and rectal distension. Furthermore, we have provided a simple (easily applicable in routine) and automatic method to quantify rectal distension and to predict registration failure using only the manual CT contours.
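For reference, the Dice similarity used to compare propagated and manual contours can be computed from binary masks as in the following minimal sketch (NumPy); the masks here are synthetic placeholders, not the study's CBCT contours.

    # Sketch: Dice similarity coefficient between two binary segmentation masks.
    # Masks are synthetic placeholders, not the study's CBCT contours.
    import numpy as np

    def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
        """Dice = 2|A ∩ B| / (|A| + |B|) for boolean arrays of equal shape."""
        a = mask_a.astype(bool)
        b = mask_b.astype(bool)
        denom = a.sum() + b.sum()
        return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

    # Example: two overlapping spheres on a 64^3 voxel grid.
    z, y, x = np.ogrid[:64, :64, :64]
    sphere_1 = (x - 30) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 15 ** 2
    sphere_2 = (x - 34) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 15 ** 2
    print("Dice:", dice(sphere_1, sphere_2))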
Abstract:
INTRODUCTION: Hip fractures are responsible for excess mortality, decreasing the 5-year survival rate by about 20%. From an economic perspective, they represent a major source of expense, with direct costs in hospitalization, rehabilitation, and institutionalization. The incidence rate sharply increases after the age of 70, but it can be reduced in women aged 70-80 years by therapeutic interventions. Recent analyses suggest that the most efficient strategy is to implement such interventions in women at the age of 70 years. As several guidelines recommend bone mineral density (BMD) screening of postmenopausal women with clinical risk factors, our objective was to assess the cost-effectiveness of two screening strategies applied to elderly women aged 70 years and older. METHODS: A cost-effectiveness analysis was performed using decision-tree analysis and a Markov model. Two alternative strategies, one measuring BMD in all women and one measuring BMD only in those having at least one risk factor, were compared with the reference strategy of no screening. Cost-effectiveness ratios were expressed as cost per year gained without hip fracture. Most probabilities were based on data observed in the EPIDOS, SEMOF and OFELY cohorts. RESULTS: In this model, which is mostly based on observed data, the strategy "screen all" was more cost-effective than "screen women at risk". For one woman screened at the age of 70 and followed for 10 years, the incremental (additional) cost-effectiveness ratios of these two strategies compared with the reference were 4,235 euros and 8,290 euros, respectively. CONCLUSION: The results of this model, under the assumptions described in the paper, suggest that in women aged 70-80 years, screening all women with dual-energy X-ray absorptiometry (DXA) would be more effective than no screening or than screening only women with at least one risk factor. Cost-effectiveness studies based on decision-analysis trees may be useful tools for helping decision makers, and further models based on different assumptions should be performed to improve the level of evidence on the cost-effectiveness ratios of the usual screening strategies for osteoporosis.
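The cost-effectiveness logic used in such analyses can be illustrated with a toy Markov cohort model and an incremental cost-effectiveness ratio (ICER) calculation; all transition probabilities, costs and effects below are invented placeholders, not the EPIDOS/SEMOF/OFELY estimates or the paper's model.

    # Toy Markov cohort model (well / hip fracture / dead) and ICER calculation.
    # All probabilities, costs and effects are invented placeholders.
    import numpy as np

    def run_strategy(p_fracture: float, upfront_cost: float, years: int = 10):
        # Annual transition matrix over states [well, fractured, dead].
        P = np.array([
            [1 - p_fracture - 0.02, p_fracture, 0.02],
            [0.0,                   0.93,       0.07],
            [0.0,                   0.0,        1.0],
        ])
        state = np.array([1.0, 0.0, 0.0])           # cohort starts in the well state
        cost, fracture_free_years = upfront_cost, 0.0
        for _ in range(years):
            fracture_free_years += state[0]          # one year gained without fracture
            state = state @ P
            cost += state[1] * 9000.0                # annual cost of the fractured state
        return cost, fracture_free_years

    c0, e0 = run_strategy(p_fracture=0.020, upfront_cost=0.0)      # no screening
    c1, e1 = run_strategy(p_fracture=0.015, upfront_cost=2000.0)   # screen + preventive treatment
    icer = (c1 - c0) / (e1 - e0)    # cost per additional year without hip fracture
    print(f"ICER: {icer:.0f} euros per fracture-free year gained")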
Abstract:
Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation currently represents the gold-standard TDM approach but requires computation assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be considered with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Computer-assisted TDM is attracting growing interest and should further improve, especially in terms of information system interfacing, user friendliness, data storage capability and report generation.
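As an illustration of the Bayesian (a posteriori) dose-adaptation calculation that most of these tools implement, the sketch below estimates an individual clearance by maximum a posteriori (MAP) fitting of a one-compartment IV bolus model; the drug, population parameters, dose and observations are invented for the example and do not come from any of the surveyed programs.

    # Sketch: MAP Bayesian estimation of individual clearance (one-compartment IV bolus).
    # Population parameters, dose and observations are invented placeholders.
    import numpy as np
    from scipy.optimize import minimize_scalar

    dose_mg = 500.0
    volume_l = 30.0                        # volume of distribution, assumed known/fixed
    pop_cl, omega_cl = 4.0, 0.3            # population clearance (L/h), inter-individual SD (log scale)
    sigma = 1.5                            # residual error SD (mg/L)

    t_obs = np.array([2.0, 8.0, 12.0])     # sampling times (h)
    c_obs = np.array([13.0, 7.5, 5.2])     # measured concentrations (mg/L)

    def predict(cl: float) -> np.ndarray:
        # C(t) = (Dose / V) * exp(-(CL / V) * t)
        return (dose_mg / volume_l) * np.exp(-cl / volume_l * t_obs)

    def neg_log_posterior(log_cl: float) -> float:
        residual = np.sum((c_obs - predict(np.exp(log_cl))) ** 2) / (2 * sigma ** 2)
        prior = (log_cl - np.log(pop_cl)) ** 2 / (2 * omega_cl ** 2)
        return residual + prior

    res = minimize_scalar(neg_log_posterior, bounds=(np.log(0.5), np.log(20.0)), method="bounded")
    cl_map = float(np.exp(res.x))
    print(f"MAP clearance: {cl_map:.2f} L/h")
    # A new dose or dosing interval can then be chosen so that concentrations
    # predicted with cl_map stay within the target range.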
Abstract:
Precise focusing is essential for transcranial MRI-guided focused ultrasound (TcMRgFUS) to minimize collateral damage to non-diseased tissues and to achieve temperatures capable of inducing coagulative necrosis at acceptable power deposition levels. CT is usually used for this refocusing but requires a separate examination ahead of the TcMRgFUS procedure. The goal of this study was to determine whether MRI with an appropriate sequence would be a viable alternative to CT for planning ultrasound refocusing in TcMRgFUS. We tested three MRI pulse sequences (3D T1-weighted volume interpolated breath-hold examination (VIBE), proton-density-weighted 3D sampling perfection with application-optimized contrasts using different flip angle evolution, and 3D true fast imaging with steady-state precession T2-weighted imaging) on patients who had already undergone a CT scan. We made detailed measurements of the calvarial structure based on the MRI data and compared these so-called 'virtual CT' measurements to detailed measurements of the calvarial structure based on the CT data, used as a reference standard. We then loaded both the standard and the virtual CT in a TcMRgFUS device and compared the calculated phase-correction values, as well as the temperature elevation in a phantom. A series of Bland-Altman measurement agreement analyses showed the T1 3D VIBE sequence to be the optimal MRI sequence with respect to minimizing the discrepancy between the MRI-derived and the CT-derived total skull thickness measurements (mean discrepancy: 0.025; 95% CL: -0.22 to 0.27; p = 0.825). The T1-weighted sequence was also optimal for estimating skull CT density and skull layer thickness. The mean difference between the phase shifts calculated with the standard CT and the virtual CT reconstructed from the T1 dataset was 0.08 ± 1.2 rad on patients and 0.1 ± 0.9 rad on the phantom. Compared to the real CT, the MR-based correction showed a 1 °C drop in the maximum temperature elevation in the phantom (7% relative drop). Without any correction, the maximum temperature elevation dropped by 6 °C (43% relative drop). We have developed an approach that allows the reconstruction of a virtual CT dataset from MRI to perform phase correction in TcMRgFUS.
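As a rough first-order illustration of why the (real or virtual) CT is needed, the per-element phase correction for a single homogeneous bone layer can be approximated from the extra travel-time phase through the skull, Δφ = 2πf · d · (1/c_water − 1/c_skull); the snippet below applies this standard relation with made-up frequency, thickness and speed-of-sound values, and is not the device's proprietary correction algorithm.

    # Sketch: first-order phase correction for transducer elements crossing a skull layer.
    # Δφ = 2πf · d · (1/c_water − 1/c_skull); all values are made-up placeholders,
    # and the sign convention depends on the implementation.
    import numpy as np

    f_hz = 650e3                 # operating frequency (Hz), placeholder
    c_water = 1500.0             # speed of sound in water/brain (m/s)
    skull_thickness_m = np.array([4.2e-3, 6.8e-3, 5.1e-3])   # per-element thickness from (virtual) CT
    c_skull = np.array([2700.0, 2400.0, 2600.0])             # per-element skull sound speed (m/s)

    phase_correction_rad = 2 * np.pi * f_hz * skull_thickness_m * (1 / c_water - 1 / c_skull)
    print(np.round(phase_correction_rad, 2))   # phase correction per element (rad)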
Implementation of IPM programs on European greenhouse tomato production areas: Tools and constraints
Abstract:
Whiteflies and whitefly-transmitted viruses are some of the major constraints on European tomato production. The main objectives of this study were to: identify where and why whiteflies are a major limitation on tomato crops; collect information about whiteflies and associated viruses; determine the available management tools; and identify key knowledge gaps and research priorities. The study was conducted within the framework of ENDURE (European Network for Durable Exploitation of Crop Protection Strategies). Two whitefly species are the main pests of tomato in Europe: Bemisia tabaci and Trialeurodes vaporariorum. Trialeurodes vaporariorum is widespread in all areas where the greenhouse industry is present, and B. tabaci has invaded, since the early 1990s, all the subtropical and tropical areas. Biotypes B and Q of B. tabaci are widespread and especially problematic. Other key tomato pests are Aculops lycopersici, Helicoverpa armigera, Frankliniella occidentalis, and leaf miners. Tomato crops are particularly susceptible to the viruses causing Tomato yellow leaf curl disease (TYLCD). High incidences of this disease are associated with high pressure from its vector, B. tabaci. The ranked importance of B. tabaci established in this study correlates with the levels of insecticide use, showing B. tabaci as one of the principal drivers of chemical control. Confirmed cases of resistance to almost all insecticides have been reported. Integrated Pest Management based on biological control (IPM-BC) is applied in all the surveyed regions and was identified as the strategy using the fewest insecticides. Other IPM components include greenhouse netting and TYLCD-tolerant tomato cultivars. Sampling techniques differ between regions; decisions are generally based upon whitefly densities and do not relate to control strategies or growing cycles. For population monitoring and control, whitefly species are always identified. In Europe, IPM-BC is the recommended strategy for sustainable tomato production. The IPM-BC approach is mainly based on inoculative releases of the parasitoids Eretmocerus mundus and Encarsia formosa and/or the polyphagous predators Macrolophus caliginosus and Nesidiocoris tenuis. However, some limitations to wider implementation have been identified: lack of biological solutions for some pests, cost of beneficials, low farmer confidence, cost of technical advice, and low pest injury thresholds. Research priorities to promote and improve IPM-BC are proposed in the following domains: (i) emergence and invasion of new whitefly-transmitted viruses; (ii) relevance of B. tabaci biotypes regarding insecticide resistance; (iii) biochemistry and genetics of plant resistance; (iv) economic thresholds and sampling techniques of whiteflies for decision making; and (v) conservation and management of native whitefly natural enemies and improvement of biological control of other tomato pests.
Abstract:
The drug discovery process has recently been profoundly changed by the adoption of computational methods that help design new drug candidates more rapidly and at lower cost. In silico drug design consists of a collection of tools that support rational decisions at the different steps of the drug discovery process, such as the identification of a biomolecular target of therapeutic interest, the selection or design of new lead compounds, and their modification to obtain better affinities as well as better pharmacokinetic and pharmacodynamic properties. Among the different tools available, particular emphasis is placed in this review on molecular docking, virtual high-throughput screening and fragment-based ligand design.
Abstract:
The aim of the project was to fill a gap in students' training with respect to their limitations in learning different work strategies and methodologies. This gap led us to consider developing a didactic system that allows fluid communication with students and can extend beyond the specific setting of the classroom. For this reason, our proposal takes as its reference a more rational use of students' working time. To reach this goal we set out a group of specific objectives, ordered according to the following two levels of attainment. Level 1: - Define didactic material that can convey, as objectively as possible, the tools students need to solve a set of specific problems. - Provide students with a didactic procedure based on a group of tasks with supporting material, in which they must test the applicability of the concepts presented in the theory classes. - Start from a system that gives students a constant theoretical reference outside the classroom while they solve the practical exercises assigned during the course. - Define the functional characteristics of the material used in the practical activity so that it serves as an effective instrument for students' own control over their assessment process. Level 2: - Use a methodology of practical activities that emphasizes students' capacity for individual creative response. - Encourage the exchange of experiences and proposed strategies for solving the practical topics through seminar sessions held before the practicals. - Promote cross-cutting relations with other areas of knowledge so that students can link their response to the practical activities proposed by other subjects with the knowledge acquired in our subject.