890 results for Minkowski metrics


Abstract: Western countries have spent substantial amounts of money to facilitate the integration of information and communication technologies (ICT) into education, hoping to find a solution to the thorny equation summarized by the famous statement "do more and better with less". Despite these efforts, and notwithstanding the real improvements due to the undeniable betterment of infrastructure and quality of service, this goal is far from being reached. Although we think it illusory to expect technology, all by itself, to solve our economic and educational problems, we firmly take the view that it can greatly contribute not only to improving learning conditions but also to rethinking the pedagogical approach, and every member of our community could take this opportunity to reflect on his or her teaching strategy. In this framework, and convinced that integrating ICT into education opens a number of very interesting avenues provided we think about teaching "out of the box", we became interested in courseware development, positioned at the intersection of didactics, cognitive science and computing. Hoping to offer a realistic and simple solution that could help develop, update, integrate and sustain courseware, we got involved in concrete projects. As we gained field experience we noticed that (i) the quality of courseware is still disappointing, among other reasons because the added value that technology can bring is not exploited as much as it could or should be, and (ii) a project requires, besides providing a useful answer to a real need, to be efficiently managed and "championed". With the aim of proposing a pragmatic and practical project-management approach, we first looked into the characteristics of open and distance learning projects. We then analysed existing methodologies in the hope of being able to use one of them, or a suitable combination, to fit our needs. In an empirical manner, proceeding by successive iterations and refinements, we defined a simple methodology and contributed to building descriptive decision-support "cards" attached to each of its phases. We describe the different actors involved in the process, focusing in particular on the pedagogical engineer, viewed as an orchestra conductor, whom we consider critical to the success of our approach. Last but not least, we validated our methodology a posteriori by reviewing four of the projects we participated in, which we consider representative of the university context. We believe that the implementation of our methodology, along with the availability of computerized decision-support cards for project managers, constitutes a valuable asset and should make it easier to measure the real impact of technologies on (i) the evolution of teaching practices, (ii) the organization, and (iii) the quality of pedagogical approaches. Our methodology could also serve as a stepping stone towards a quality-assurance approach specific to open and distance learning. Further research on the contribution of technologies to the real flexibilization of learning could then be conducted on the basis of metrics that remain to be defined.

Due to intense international competition, demanding and sophisticated customers, and diverse, transforming technological change, organizations need to renew their products and services by allocating resources to research and development (R&D). Managing R&D is complex, but vital for many organizations to survive in a dynamic, turbulent environment. Thus, the increased interest among decision-makers in finding the right performance measures for R&D is understandable. The measures or evaluation methods of R&D performance can be utilized for multiple purposes: for strategic control, for justifying the existence of R&D, for providing information and improving activities, as well as for motivating and benchmarking. Earlier research in the field of R&D performance analysis has generally focused either on the activities and the relevant factors and dimensions - e.g. strategic perspectives, purposes of measurement, levels of analysis, types of R&D or phases of the R&D process - that precede the selection of R&D performance measures, or on proposed principles or the actual implementation of the selection or design processes of R&D performance measures or measurement systems. This study aims at integrating the consideration of the essential factors and dimensions of R&D performance analysis into developed selection processes of R&D measures, which have been applied in real-world organizations. The earlier models for corporate performance measurement found in the literature are to some extent adaptable to the development of measurement systems and the selection of measures in R&D activities. However, it is necessary to emphasize the special aspects of measuring R&D performance in a way that makes the development of new approaches, especially for R&D performance measure selection, necessary: First, the special characteristics of R&D - such as the long time lag between inputs and outcomes, as well as the overall complexity and difficult coordination of activities - give rise to R&D performance analysis problems, such as the need for more systematic, objective, balanced and multi-dimensional approaches to R&D measure selection, as well as the incompatibility of R&D measurement systems with other corporate measurement systems and vice versa. Secondly, the above-mentioned characteristics and challenges bring forth the significance of the influencing factors and dimensions that need to be recognized in order to derive the selection criteria for measures and choose the right R&D metrics, which is the most crucial step in the measurement system development process. The main purpose of this study is to support the management and control of the research and development activities of organizations by increasing the understanding of R&D performance analysis, clarifying the main factors related to the selection of R&D measures, and providing novel types of approaches and methods for systematizing the whole strategy- and business-based selection and development process of R&D indicators. The final aim of the research is to support management in their R&D decision making with suitable, systematically chosen measures or evaluation methods of R&D performance. Thus, the emphasis in most sub-areas of the present research has been on promoting the selection and development process of R&D indicators with the help of different tools and decision support systems, i.e. the research has normative features through providing guidelines by means of novel types of approaches.
The gathering of data and the case studies conducted in metal and electronics industry companies, in the information and communications technology (ICT) sector, and in non-profit organizations helped us to formulate a comprehensive picture of the main challenges of R&D performance analysis in different organizations, which is essential, as recognition of the most important problem areas is a crucial element in the constructive research approach utilized in this study. Multiple practical benefits regarding the defined problem areas could be found in the various constructed approaches presented in this dissertation: 1) the selection of R&D measures became more systematic compared to the empirical analysis, as it was common that no systematic approaches had been utilized in the studied organizations earlier; 2) the evaluation methods or measures of R&D chosen with the help of the developed approaches can be more directly utilized in decision-making, because of the thorough consideration of the purpose of measurement, as well as of the other dimensions of measurement; 3) more balance in the set of R&D measures was desired and gained through the holistic approaches to the selection processes; and 4) more objectivity was gained through organizing the selection processes, as the earlier systems were considered subjective in many organizations. Scientifically, this dissertation aims to contribute to the present body of knowledge on R&D performance analysis by facilitating the handling of the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing the selection of R&D performance measures, and by integrating these aspects into the novel types of approaches, methods and tools developed for the selection processes of R&D measures and applied in real-world organizations. Throughout the research, the handling of the versatility and challenges of R&D performance analysis, as well as the factors and dimensions influencing R&D performance measure selection, is strongly integrated with the constructed approaches. Thus, the research meets the above-mentioned purposes and objectives of the dissertation from both the scientific and the practical point of view.
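
The abstract does not spell out the constructed selection approaches themselves; as a purely hypothetical sketch of the kind of decision support it refers to, a weighted-criteria scoring of candidate R&D measures could look like the following (the criteria, weights and candidate measures are invented for illustration and are not taken from the dissertation).

    # Purely illustrative weighted-criteria scoring of candidate R&D measures.
    # Criteria, weights and candidates are hypothetical; the dissertation's own
    # selection approaches are not reproduced here.

    criteria_weights = {"strategic alignment": 0.4, "data availability": 0.3,
                        "balance/coverage": 0.2, "cost of measurement": 0.1}

    # Scores 1 (poor) .. 5 (excellent) per criterion for each candidate measure.
    candidates = {
        "time-to-market": {"strategic alignment": 5, "data availability": 4,
                           "balance/coverage": 3, "cost of measurement": 4},
        "share of sales from new products": {"strategic alignment": 5, "data availability": 3,
                           "balance/coverage": 4, "cost of measurement": 3},
        "number of patents": {"strategic alignment": 2, "data availability": 5,
                           "balance/coverage": 2, "cost of measurement": 5},
    }

    def weighted_score(scores):
        """Aggregate one candidate's criterion scores into a single value."""
        return sum(criteria_weights[c] * s for c, s in scores.items())

    for name, scores in sorted(candidates.items(), key=lambda kv: -weighted_score(kv[1])):
        print(f"{name:35s} {weighted_score(scores):.2f}")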

Software engineering is criticized as not being real engineering or a well-developed science at all. Software engineers often do not know exactly how long their projects will last, what they will cost, or whether the software will work properly after release. Measurements have to be taken in software projects to improve this situation. It is of limited use to only collect metrics afterwards; the values of the relevant metrics have to be predicted, too. The predictions (i.e. estimates) form the basis for proper project management. One of the most painful problems in software projects is effort estimation. It has a clear and central effect on other project attributes like cost and schedule, and on product attributes like size and quality. Effort estimation can be used for several purposes; in this thesis only effort estimation in software projects for project management purposes is discussed. There is a short introduction to measurement issues, and some metrics relevant in the estimation context are presented. Effort estimation methods are covered quite broadly. The main new contribution of this thesis is the new estimation model that has been created. It makes use of the basic concepts of Function Point Analysis, but avoids the problems and pitfalls found in that method, and it is relatively easy to use and learn. Effort estimation accuracy improved significantly after the model was taken into use. A major innovation related to the new estimation model is the identified need for hierarchical software size measurement, for which the author has developed a three-level solution. All currently used size metrics are static in nature, but the newly proposed metric is dynamic: it exploits the increased understanding of the nature of the work as specification and design proceed, and thus 'grows up' along with the software project. Developing the effort estimation model is not possible without gathering and analyzing history data. However, there are many problems with data in software engineering; a major roadblock is the amount and quality of the data available. This thesis shows some useful techniques that have been successful in gathering and analyzing the data needed. An estimation process is needed to ensure that methods are used properly, that estimates are stored, reported and analyzed properly, and that they are used for project management activities. A higher-level mechanism, called a measurement framework, is also briefly introduced; its purpose is to define and maintain a measurement or estimation process. Without a proper framework, the estimation capability of an organization declines, and it takes effort even to maintain an achieved level of estimation accuracy. Estimation results over several successive releases are analyzed, and it is clearly seen that the new estimation model works and that the estimation improvement actions have been successful. The calibration of the hierarchical model is a critical activity; an example is shown to shed more light on the calibration and on the model itself. There are also remarks about the sensitivity of the model. Finally, an example of usage is shown.
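
The thesis's own estimation model is not reproduced in this abstract. As a hedged illustration of how estimation accuracy across releases can be tracked, the sketch below computes the standard MRE/MMRE and PRED(25) accuracy measures from the effort-estimation literature; these are assumptions about suitable measures, not necessarily the ones used in the thesis, and the effort figures are invented.

    # Illustrative sketch: tracking effort-estimation accuracy with MRE/MMRE and
    # PRED(25). These are standard accuracy metrics from the estimation
    # literature; the thesis's own model and measures may differ.

    def mre(actual: float, estimate: float) -> float:
        """Magnitude of Relative Error for one project or release."""
        return abs(actual - estimate) / actual

    def mmre(actuals, estimates) -> float:
        """Mean MRE over a set of completed projects."""
        return sum(mre(a, e) for a, e in zip(actuals, estimates)) / len(actuals)

    def pred(actuals, estimates, level: float = 0.25) -> float:
        """PRED(25): share of estimates within 25% of the actual effort."""
        hits = sum(1 for a, e in zip(actuals, estimates) if mre(a, e) <= level)
        return hits / len(actuals)

    # Hypothetical person-hours for three past releases.
    actual_effort = [1200, 850, 990]
    estimated_effort = [1000, 900, 1100]
    print(f"MMRE = {mmre(actual_effort, estimated_effort):.2f}, "
          f"PRED(25) = {pred(actual_effort, estimated_effort):.2f}")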

Estimation of the dimensions of fluvial geobodies from core data is a notoriously difficult problem in reservoir modeling. To try to improve such estimates and, hence, reduce uncertainty in geomodels, data on dunes, unit bars, cross-bar channels, and compound bars and their associated deposits are presented herein from the sand-bed braided South Saskatchewan River, Canada. These data are used to test models that relate the scale of the formative bed forms to the dimensions of the preserved deposits and, therefore, provide an insight as to how such deposits may be preserved over geologic time. The preservation of bed-form geometry is quantified by comparing the alluvial architecture above and below the maximum erosion depth of the modern channel deposits. This comparison shows that there is no significant difference in the mean set thickness of dune cross-strata above and below the basal erosion surface of the contemporary channel, thus suggesting that dimensional relationships between dune deposits and the formative bed-form dimensions are likely to be valid for both recent and older deposits. The data show that estimates of mean bankfull flow depth derived from dune, unit bar, and cross-bar channel deposits are all very similar. Thus, the use of all these metrics together can provide a useful check that all components and scales of the alluvial architecture have been identified correctly when building reservoir models. The data also highlight several practical issues with identifying and applying data relating to cross-strata. For example, the deposits of unit bars were found to be severely truncated in length and width, with only approximately 10% of the mean bar-form length remaining, thus making identification in section difficult. For similar reasons, the deposits of compound bars were found to be especially difficult to recognize, and hence estimates of channel depth based on this method may be problematic. Where only core data are available (i.e., no outcrop data exist), formative flow depths are suggested to be best reconstructed using cross-strata formed by dunes. However, theoretical relationships between the distribution of set thicknesses and formative dune height are found to result in slight overestimates of the latter and, hence, of the mean bankfull flow depths derived from these measurements. This article illustrates that the preservation of fluvial cross-strata and, thus, the paleohydraulic inferences that can be drawn from them are a function of the ratio of the size and migration rate of bed forms to the time scale of aggradation and channel migration. These factors must thus be considered when deciding on appropriate length:thickness ratios for the purposes of object-based modeling in reservoir characterization.
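
As a rough illustration of the kind of paleohydraulic reconstruction discussed above, the sketch below converts cross-set thicknesses into an estimated mean dune height and bankfull flow depth using commonly cited scaling relations (mean dune height roughly 2.9 times mean set thickness, and bankfull depth roughly 6-10 dune heights). These factors are assumptions drawn from the general literature, not values reported in this article, and the input measurements are invented.

    # Illustrative paleohydraulic sketch. The scaling factors are commonly cited
    # literature values (mean dune height ~= 2.9 x mean set thickness; bankfull
    # depth ~= 6-10 dune heights) and are assumptions here, not numbers taken
    # from this article.
    import statistics

    def estimate_flow_depth_from_sets(set_thicknesses_m):
        """Return (mean dune height, (min depth, max depth)) in metres."""
        mean_set = statistics.mean(set_thicknesses_m)
        mean_dune_height = 2.9 * mean_set           # Leclair & Bridge (2001)-type relation
        depth_range = (6 * mean_dune_height, 10 * mean_dune_height)
        return mean_dune_height, depth_range

    # Hypothetical core measurements of dune cross-set thickness (m):
    sets = [0.18, 0.22, 0.15, 0.30, 0.25]
    h, (d_min, d_max) = estimate_flow_depth_from_sets(sets)
    print(f"mean dune height ~ {h:.2f} m, bankfull depth ~ {d_min:.1f}-{d_max:.1f} m")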

Although usability evaluations have focused on assessing different contexts of use, no proper specifications have been proposed for the particular environment of academic websites in the Spanish-speaking context of use. Considering that this context involves hundreds of millions of potential users, the AIPO Association is running the UsabAIPO Project. Its ultimate goal is to promote an adequate translation of international standards, methods and ideal values related to usability in order to adapt them to diverse Spanish-related contexts of use. This article presents the main statistical results from the Second and Third Stages of the UsabAIPO Project, in which the UsabAIPO Heuristic method (based on Heuristic Evaluation techniques) and seven Cognitive Walkthroughs were applied to 69 university websites. The planning and execution of the UsabAIPO Heuristic method and the Cognitive Walkthroughs, the definition of two usability metrics, and the outline of the UsabAIPO Heuristic Management System prototype are also sketched.
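
The two usability metrics actually defined in the project are not given in this abstract; the sketch below is only a generic, hypothetical example of turning heuristic-evaluation findings into a single site-level score (the heuristic names, severity scale and weighting are assumptions, not the UsabAIPO metrics).

    # Hypothetical severity-weighted heuristic-evaluation score; not the metrics
    # defined by the UsabAIPO project, just a generic aggregation example.
    from dataclasses import dataclass

    @dataclass
    class Finding:
        heuristic: str     # e.g. "Visibility of system status"
        severity: int      # 0 (cosmetic) .. 4 (catastrophic), Nielsen-style scale

    def site_usability_score(findings: list[Finding], n_heuristics: int = 10) -> float:
        """Return a 0-100 score: 100 = no problems, lower = more/severer problems."""
        if not findings:
            return 100.0
        penalty = sum(f.severity for f in findings) / (4 * n_heuristics)
        return max(0.0, 100.0 * (1.0 - penalty))

    findings = [Finding("Consistency and standards", 3),
                Finding("Error prevention", 2),
                Finding("Visibility of system status", 1)]
    print(f"score = {site_usability_score(findings):.1f}")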

We evaluated the performance of an optical-camera-based prospective motion correction (PMC) system in improving the quality of 3D echo-planar imaging functional MRI data. An optical camera and external marker were used to dynamically track the head movement of subjects during fMRI scanning. PMC was performed by using the motion information to dynamically update the sequence's RF excitation and gradient waveforms such that the field-of-view was realigned to match the subject's head movement. Task-free fMRI experiments on five healthy volunteers followed a 2×2×3 factorial design with the following factors: PMC on or off; 3.0 mm or 1.5 mm isotropic resolution; and no, slow, or fast head movements. Visual and motor fMRI experiments were additionally performed on one of the volunteers at 1.5 mm resolution comparing PMC on versus PMC off for no and slow head movements. Metrics were developed to quantify the amount of motion as it occurred relative to k-space data acquisition. The motion quantification metric collapsed the very rich camera tracking data into one scalar value for each image volume that was strongly predictive of motion-induced artifacts. The PMC system did not introduce extraneous artifacts for the no-motion conditions and improved the time-series temporal signal-to-noise ratio (tSNR) by 30% to 40% for all combinations of low/high resolution and slow/fast head movement relative to the standard acquisition with no prospective correction. The numbers of activated voxels (p<0.001, uncorrected) in both task-based experiments were comparable for the no-motion cases and increased by 78% and 330%, respectively, for PMC on versus PMC off in the slow-motion cases. The PMC system is a robust solution to decrease the motion sensitivity of multi-shot 3D EPI sequences and thereby overcome one of the main roadblocks to their widespread use in fMRI studies.
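
For reference, the temporal signal-to-noise ratio used for such comparisons is simply the voxel-wise temporal mean divided by the temporal standard deviation; the sketch below computes it on synthetic data (the array shape and values are illustrative and not taken from this study).

    # Minimal sketch of the temporal signal-to-noise ratio (tSNR): voxel-wise
    # mean over time divided by the standard deviation over time.
    import numpy as np

    def tsnr(timeseries: np.ndarray) -> np.ndarray:
        """timeseries: 4D array (x, y, z, t). Returns a 3D tSNR map."""
        mean = timeseries.mean(axis=-1)
        std = timeseries.std(axis=-1)
        return np.divide(mean, std, out=np.zeros_like(mean), where=std > 0)

    # Synthetic example: a 10x10x10 volume with 100 time points.
    rng = np.random.default_rng(0)
    data = 1000 + rng.normal(scale=20, size=(10, 10, 10, 100))
    print(f"median tSNR = {np.median(tsnr(data)):.1f}")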

The goal of this thesis was to build for the target company a quality system for purchasing, domestic sales, warehousing and deliveries, based on the ISO 9002 standard, which can later be certified. The finished quality system is documented in written form in a quality manual. The second goal was to create metrics for the quality system that make it possible to monitor how well the company's quality system works. The third goal was to achieve, through the quality system, long-term financial savings in the company's operations. An important part of the work was to write the company's quality manual in accordance with the requirements of the ISO 9002 standard. The work is divided into eight larger parts: defining the basic concepts related to quality and services, describing wholesale trade, presenting the target company, carrying out a staff survey, discussing quality systems, presenting the requirements of the ISO 9002 standard, examining the target company's quality manual, and reviewing and discussing the results achieved. As a result of the work it can be stated that building a quality system for the company has been a good solution. With the help of the quality system, savings and development steps have been achieved. The investment in better quality has paid off.

This thesis surveys temporal and stochastic software reliability models and studies a few of the models in practice. The theoretical part of the thesis covers the key definitions and metrics used to describe and assess software reliability, as well as the descriptions of the models themselves. Two groups of software reliability models are presented. The first group consists of risk-based (failure-rate) models. The second group comprises models based on fault seeding and on fault significance. The empirical part of the thesis contains the descriptions and results of the experiments. The experiments were carried out using three models from the first group: the Jelinski-Moranda model, the first geometric model, and a simple exponential model. The purpose of the experiments was to study how the distribution of the input data affects how well the models work, and how sensitive the models are to changes in the amount of input data. The Jelinski-Moranda model turned out to be the most sensitive to the distribution, owing to convergence problems, and the first geometric model the most sensitive to changes in the amount of data.
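
To make the first of these models concrete: in the Jelinski-Moranda model the hazard rate after i-1 faults have been fixed is φ(N - i + 1), so inter-failure times are exponentially distributed with that rate. The sketch below fits N and φ by maximising the profile log-likelihood over integer N; the inter-failure times are invented, and hitting the upper search bound corresponds to the kind of non-convergence mentioned above.

    # Maximum-likelihood fit of the Jelinski-Moranda model on made-up data.
    # Hazard between failures i-1 and i: phi * (N - i + 1).
    import math

    def jm_fit(times, n_max=500):
        """times: inter-failure times t_1..t_n. Returns (N_hat, phi_hat).
        If N_hat equals n_max, the likelihood keeps growing with N and the
        estimate effectively does not converge."""
        n = len(times)
        best_ll, best = -math.inf, None
        for N in range(n, n_max + 1):
            weighted = sum((N - j) * t for j, t in enumerate(times))  # sum of (N-i+1)*t_i
            phi = n / weighted                                        # MLE of phi for this N
            ll = n * math.log(phi) + sum(math.log(N - j) for j in range(n)) - n
            if ll > best_ll:
                best_ll, best = ll, (N, phi)
        return best

    # Hypothetical inter-failure times (hours) showing reliability growth:
    t = [7, 11, 8, 10, 15, 22, 20, 25, 28, 35]
    N_hat, phi_hat = jm_fit(t)
    print(f"estimated initial faults N = {N_hat}, per-fault hazard phi = {phi_hat:.4f}")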

This thesis studies the effect of software architecture design properties on the design and implementation time of a mobile service application based on a client-server architecture. The study builds on a real-life project whose qualitative analysis revealed that the couplings between architectural components have a significant effect on the project's effort. The main objective of the thesis was to investigate the validity of this observation quantitatively. To reach this goal, a set of software architecture design metrics was defined to describe the architecture of the system's subsystems, and two effort-estimation models using these metrics were created (effort being the sum of a component's design, implementation and testing times), one linear and one non-linear. The coefficients of the models were fitted with a non-linear global optimization method, the differential evolution algorithm, so that the model outputs best matched the measured effort, both with all attributes included and with subsets of them (leaving one attribute out in turn). When the couplings between architectural components were left out of the models, the difference between the measured and estimated effort (expressed as an error) grew in one case by 367%, meaning that the resulting model matched the implementation times poorly on the given data. This was the largest error observed among all the omitted attributes. Based on this result it was concluded that the implementation times of the system depend strongly on the number of couplings, and that the number of couplings was therefore most likely the single most important factor affecting effort in the architecture design of the studied system.
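
As a hedged sketch of the fitting approach described above, the example below uses SciPy's differential evolution to fit a linear effort model to a small table of architecture metrics; the metric columns, effort values and model form are hypothetical stand-ins for the thesis's actual data and models.

    # Illustrative fit of a linear effort model with differential evolution.
    # The metrics, data and model form are hypothetical; only the overall
    # fitting approach follows the text above.
    import numpy as np
    from scipy.optimize import differential_evolution

    # Rows: components; columns: architecture metrics, e.g. [couplings, interfaces, size].
    X = np.array([[4, 2, 120], [7, 3, 200], [2, 1, 80], [9, 4, 260], [5, 2, 150]], float)
    effort = np.array([30, 55, 16, 70, 38], float)   # person-hours (made up)

    def sse(coeffs):
        """Sum of squared errors of a linear model: effort ~ c0 + X @ c[1:]."""
        pred = coeffs[0] + X @ coeffs[1:]
        return float(np.sum((effort - pred) ** 2))

    bounds = [(-50, 50)] * (X.shape[1] + 1)
    result = differential_evolution(sse, bounds, seed=1)
    print("fitted coefficients:", np.round(result.x, 3), "SSE:", round(result.fun, 2))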

The theoretical part of the thesis deals with the dimensions of performance, performance measurement, and maintenance. The dimensions of performance are used to show that developing maintenance pays off through increased productivity and improved profitability. Performance measurement and its premises are used to create the metrics with which changes in these dimensions can be detected. Maintenance provides the means by which the dimensions of performance are improved. The practical part describes events observed in the target company during the development of maintenance and the introduction of the set of metrics. In addition, this empirical part reports the actions taken at the plant. The empirical part concludes with a list of proposals for continuing the project after this study.

Huntington's disease is an incurable neurodegenerative disease caused by inheritance of an expanded cytosine-adenine-guanine (CAG) trinucleotide repeat within the Huntingtin gene. Extensive volume loss and altered diffusion metrics in the basal ganglia, cortex and white matter are seen when patients with Huntington's disease (HD) undergo structural imaging, suggesting that changes in basal ganglia-cortical structural connectivity occur. The aims of this study were to characterise altered patterns of basal ganglia-cortical structural connectivity with high anatomical precision in premanifest and early manifest HD, and to identify associations between structural connectivity and genetic or clinical markers of HD. 3-Tesla diffusion tensor magnetic resonance images were acquired from 14 early manifest HD subjects, 17 premanifest HD subjects and 18 controls. Voxel-based analyses of probabilistic tractography were used to quantify basal ganglia-cortical structural connections. Canonical variate analysis was used to demonstrate disease-associated patterns of altered connectivity and to test for associations between connectivity and genetic and clinical markers of HD; this is the first study in which such analyses have been used. Widespread changes were seen in basal ganglia-cortical structural connectivity in early manifest HD subjects; this has relevance for development of therapies targeting the striatum. Premanifest HD subjects had a pattern of connectivity more similar to that of controls, suggesting progressive change in connections over time. Associations between structural connectivity patterns and motor and cognitive markers of disease severity were present in early manifest subjects. Our data suggest the clinical phenotype in manifest HD may be at least partly a result of altered connectivity. Hum Brain Mapp 36:1728-1740, 2015. © 2015 Wiley Periodicals, Inc.
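
As a simplified, hypothetical analogue of the canonical variate analysis used in this study, the sketch below runs a canonical correlation analysis between synthetic connectivity measures and clinical scores; it illustrates only the general idea of testing multivariate associations, not the study's actual pipeline or data.

    # Hedged sketch: canonical correlation between connectivity features and
    # clinical scores, conceptually similar to (but much simpler than) the
    # canonical variate analysis used in the study. All data are synthetic.
    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(42)
    n_subjects = 30
    connectivity = rng.normal(size=(n_subjects, 8))   # e.g. tract-wise connection strengths
    clinical = np.column_stack([
        connectivity[:, 0] * 0.8 + rng.normal(scale=0.5, size=n_subjects),  # motor score
        rng.normal(size=n_subjects),                                        # cognitive score
    ])

    cca = CCA(n_components=1)
    u, v = cca.fit_transform(connectivity, clinical)
    r = np.corrcoef(u[:, 0], v[:, 0])[0, 1]
    print(f"first canonical correlation: {r:.2f}")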

This master's thesis was carried out as part of Componenta Cast Components' three-year supply chain development project. The goal of the thesis was to describe a typical internal supply chain process and to carry out a preliminary performance analysis of the logistics process between the foundry and the machine shop. The aim was also to find development targets in the management of material and information flows between these production units. Suitable analysis methods were chosen on the basis of a literature study on logistics, supply chain management and supply chain performance measurement, together with practical experience. These methods were used to describe the order-delivery process and to analyse performance in the company's internal supply chain. As a natural continuation, a pull-type production and material control method that synchronizes the supply chain was developed and put into practice. During the thesis project, tools were also developed to support the proper use of the introduced method. The thesis project took the first steps towards an integrated internal supply chain. The standardization of the new production and material control method, combined with other methods, as well as further development of the supply chain's key metrics, has already begun. The degree of integration can be raised further by shortening lead times and by means of a synchronized, transparent demand-supply chain. Cross-organizational development and management of the supply chain is a key prerequisite for success.

We use a set of landscape metrics to study the long-term evolution of a typical Mediterranean coastal area from 1850 to 2005; the metrics show a severe deterioration of the environment between 1950 and 2005. The main driving forces of this landscape degradation have been the urban growth experienced in the former agricultural areas of the coastal plains, together with the abandonment and reforestation of the hill slopes, intersected by low-density residential areas, roads and other linear infrastructure. We carry out a statistical redundancy analysis (RDA) in order to identify what we consider to be some of the ultimate socioeconomic and political driving agents of these environmental impacts. The results confirm our interpretive hypotheses, namely that: 1) changes in land cover and land use determine changes in landscape properties, both structural and functional; 2) these changes do not occur at random, but are related to geographic factors and to socioeconomic and political forces.
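
The abstract does not list the exact metric set used; as one standard example of a landscape metric, the sketch below computes Shannon's diversity index (SHDI) from land-cover class areas for two hypothetical dates (class names and areas are invented, not taken from the study).

    # One standard landscape metric, Shannon's diversity index (SHDI), computed
    # from land-cover class areas. The classes and areas below are invented.
    import math

    def shannon_diversity(areas):
        """SHDI = -sum(p_i * ln p_i) over land-cover class area proportions."""
        total = sum(areas.values())
        return -sum((a / total) * math.log(a / total) for a in areas.values() if a > 0)

    landscape_1850 = {"cropland": 55.0, "forest": 30.0, "urban": 2.0, "wetland": 13.0}   # km^2
    landscape_2005 = {"cropland": 15.0, "forest": 40.0, "urban": 35.0, "wetland": 10.0}
    print(f"SHDI 1850 = {shannon_diversity(landscape_1850):.2f}, "
          f"2005 = {shannon_diversity(landscape_2005):.2f}")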

This master's thesis examines threaded programming at the upper hierarchy level of parallel programming, focusing in particular on hyper-threading technology. The thesis discusses the advantages and disadvantages of hyper-threading and its effects on parallel algorithms. The goal of the work was to understand the implementation of hyper-threading in the Intel Pentium 4 processor and to make it possible to exploit it where it brings a performance benefit. Performance data was collected and analysed by running a large set of benchmarks under different conditions (memory handling, compiler settings, environment variables, and so on). Two types of algorithms were examined: matrix operations and sorting. These applications have a regular memory access pattern, which is a double-edged sword: it is an advantage for arithmetic-logic processing, but on the other hand it degrades memory performance. The reason is that modern processors have very good raw performance when processing regular data, whereas the memory architecture is limited by cache sizes and by various buffers. When the problem size exceeds a certain limit, the actual performance can drop to a fraction of the peak performance.
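
As a hedged, present-day illustration of the cache effect described in the last sentences, the sketch below times a simple array reduction while the working set grows past typical cache sizes; the absolute numbers are machine-dependent, and this does not reproduce the thesis's Pentium 4 hyper-threading benchmarks.

    # Effective bandwidth of a repeated array reduction as the working set grows
    # past typical cache sizes. Sizes and interpretation are illustrative only.
    import time
    import numpy as np

    def bandwidth_gb_s(n_bytes: int, total_bytes: int = 2 << 30) -> float:
        """Effective bandwidth (GB/s) of repeatedly summing an array of n_bytes."""
        data = np.ones(n_bytes // 8, dtype=np.float64)
        repeats = max(1, total_bytes // n_bytes)   # keep total data processed ~constant
        start = time.perf_counter()
        for _ in range(repeats):
            data.sum()                             # streams the whole array each time
        elapsed = time.perf_counter() - start
        return repeats * n_bytes / elapsed / 1e9

    for mib in (1, 16, 256):   # roughly L2-resident, L3-resident, RAM-resident
        print(f"{mib:4d} MiB working set: ~{bandwidth_gb_s(mib * 1024 * 1024):.1f} GB/s")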