Abstract:
France amended its constitution in 2005 to include a Charter for the Environment. The Charter lays out France's commitment to supporting the right to a 'balanced environment'. This article first traces the Charter's origins to a legacy-building presidential initiative. Jacques Chirac decided to invest in a neglected policy domain in which his own majority had shown little interest. He was obliged to intervene repeatedly in order to bring this project to a successful conclusion. In doing so, he staked out environmental affairs as an area of potential presidential supremacy. Next, the content of the Charter is examined. In this document, French traditions of universalism come together with an international movement for anticipatory environmental protection. This is reflected in the constitutionalisation of the precautionary principle, which emerged as the most controversial part of the Charter. The debates this provoked tended to caricature a risk-management principle whose meaning has been carefully refined to forestall objections. Finally, the Charter's potential efficacy is analysed. The post-Charter record of legislative and judicial activity concerning the environment is meagre, but not wholly inauspicious.
Abstract:
The objective of this study was to evaluate the efficiency of spatial statistical analysis in the selection of genotypes in a plant breeding program and, in particular, to demonstrate the benefits of the approach when experimental observations are not spatially independent. The basic material of this study was a yield trial of soybean lines, with five check varieties (fixed effects) and 110 test lines (random effects), in an augmented block design. The spatial analysis used a random field linear model (RFML), with a covariance function estimated from the residuals of an analysis assuming independent errors. Results showed a residual autocorrelation of significant magnitude and extension (range), which allowed a better discrimination among genotypes (increased power of the statistical tests, reduced standard errors of estimates and predictors, and a greater amplitude of predictor values) when the spatial analysis was applied. Furthermore, the spatial analysis led to a different ranking of the genetic materials than the non-spatial analysis, and yielded a selection less influenced by local variation effects.
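The residual spatial autocorrelation the abstract refers to can be illustrated with a small sketch. The following is a minimal example, assuming numpy, of computing Moran's I, a standard measure of spatial autocorrelation in residuals; the residual values and the rook-contiguity neighbour structure are invented for illustration and are not data from the soybean trial.

```python
import numpy as np

def morans_i(values, weights):
    """Moran's I spatial autocorrelation statistic.

    values  : 1-D array of residuals (one per plot)
    weights : (n, n) spatial weight matrix (weights[i, j] > 0 for neighbours)
    """
    z = values - values.mean()
    n = len(values)
    w_sum = weights.sum()
    return (n / w_sum) * (z @ weights @ z) / (z @ z)

# Toy example: residuals on a 1-D transect of 6 plots,
# adjacent plots treated as neighbours (rook contiguity).
resid = np.array([1.2, 1.0, 0.8, -0.9, -1.1, -1.0])
n = len(resid)
W = np.zeros((n, n))
for i in range(n - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0

# Values well above 0 indicate positive spatial autocorrelation,
# i.e. neighbouring plots tend to share similar residuals.
print(round(morans_i(resid, W), 3))
```

A value near the upper end of the statistic's range, as here, is the kind of evidence that would motivate switching from an independent-errors analysis to a spatial model.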
Abstract:
This report presents the results of a comparative laboratory study of well- and gap-graded aggregates used in asphalt concrete paving mixtures. A total of 424 batches of asphalt concrete mixtures and 3,960 Marshall and Hveem specimens were examined. The statistical analysis conducted in this experiment focused on the calibration study and on Part I of the experiment. In the former, the compaction procedures of the Iowa State University Lab and the Iowa Highway Commission Lab were calibrated against each other. An analysis of the errors associated with the measurements allowed us to separate the "preparation" and "determination" errors for both laboratories and to develop the calibration curve describing the relationship between the compaction procedures at the two labs. Part I demonstrated the use of a fractional factorial design in a split-plot experiment for measuring the effect of several factors on asphalt concrete strength and weight. It also outlined the use of half-normal plotting techniques for identifying significant factors and interactions and for estimating errors in experiments with only a limited number of observations.
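The half-normal plotting technique mentioned above can be sketched as follows. This is an illustrative example, assuming numpy and scipy; the 2^3 factorial responses are invented, not data from the report. Effects whose ordered absolute values fall far above the line through the smaller quantiles are flagged as significant.

```python
import numpy as np
from scipy.stats import halfnorm

# Illustrative 2^3 full factorial (factors A, B, C coded -1/+1);
# the responses are made-up numbers, not data from the report.
levels = np.array([[a, b, c]
                   for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)])
y = np.array([60., 72., 54., 68., 52., 83., 45., 80.])

# Main effect of each factor = mean(y at +1) - mean(y at -1).
effects = {name: y[levels[:, i] == 1].mean() - y[levels[:, i] == -1].mean()
           for i, name in enumerate("ABC")}

# Half-normal plot coordinates: ordered |effect| against
# half-normal quantiles at plotting positions (i - 0.5) / m.
abs_eff = np.sort(np.abs(list(effects.values())))
m = len(abs_eff)
quantiles = halfnorm.ppf((np.arange(1, m + 1) - 0.5) / m)

for q, e in zip(quantiles, abs_eff):
    print(f"{q:5.2f}  {e:6.2f}")
```

In this toy data the effect of C is an order of magnitude larger than A and B, so its point would sit far off the line of inert effects, which is exactly how the technique singles out significant factors without a separate error estimate.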
Abstract:
The use of open source software continues to grow on a daily basis. Today, enterprise applications contain 40% to 70% open source code, a fact that has legal, development, IT security, risk management and compliance organizations focusing their attention on its use as never before. They increasingly understand that the open source content within an application must be detected. Once it is uncovered, decisions regarding compliance with intellectual property licensing obligations must be made, and known security vulnerabilities must be remediated. From a risk perspective, it is no longer sufficient to leave either of these open source issues unaddressed.
Abstract:
This paper aims to better understand the development of students’ learning processes when participating actively in a specific Computer Supported Collaborative Learning system called KnowCat. To this end, a longitudinal case study was designed, in which eighteen university students took part in a 12-month (two semesters) learning project. During this time period, the students followed an instructional process, using some elements of KnowCat (KnowCat key features) design to support and improve their interaction processes, especially peer learning processes. Our research involved both supervising the students’ collaborative learning processes throughout the learning project and focusing our analysis on the qualitative evolution of the students’ interaction processes and on the development of metacognitive learning processes. The results of the current research reveal that the instructional application of the CSCL-KnowCat system may favour and improve the development of the students’ metacognitive learning processes. Additionally, the implications of the design of computer supported collaborative learning networks and pedagogical issues are discussed in this paper.
Abstract:
In this research work we searched for open source libraries which support graph drawing and visualisation and can run in a browser. These libraries were then evaluated to determine which one is best suited for the task. The result was that d3.js is the library with the greatest functionality, flexibility and customisability. We then developed an open source software tool incorporating d3.js, written in JavaScript so that it can run browser-based.
Abstract:
The creation of a Pilot Project to support the pre-development stage of software product elaboration can nowadays be used as an approach for improving the overall running of an information technology project. The subject is not new, but until now no model has been presented that gives a deep description of this important stage in the early phase of a project. This Master's thesis presents research results and findings concerning the pre-development study from a Software Engineering point of view. The aspects of feasibility study and pilot prototype development are analysed. As a result, a Pilot Project technique is formulated and its scheme is presented. The experimental part focuses on one particular area of application of the Pilot Project scheme: internationally distributed software projects. Its specific characteristics, aspects, obstacles, advantages and disadvantages are considered using the example of the cross-border region of Russia and Finland, and a real case of applying the Pilot Project technique is given.
Abstract:
The goal of requirements specification is to produce, at the conceptual level, a complete and consistent list of requirements for the desired system. Business process modelling is quite useful in the early phases of requirements specification. This thesis studies business process modelling for the development of information systems. Various techniques for business process modelling exist today. The thesis reviews the principles and aspects of business process modelling as well as different modelling techniques. A new method, designed especially for small and medium-sized software projects, has been developed on the basis of process aspects and UML diagrams.
Abstract:
Plants must constantly adapt to a changing light environment in order to optimize energy conversion through the process of photosynthesis and to limit photodamage. In addition, plants use light cues for the timing of key developmental transitions such as the initiation of reproduction (transition to flowering). Plants are equipped with a battery of photoreceptors enabling them to sense a very broad light spectrum spanning from UV-B to far-red wavelengths (280–750 nm). In this review we briefly describe the different families of plant photosensory receptors and the mechanisms by which they transduce environmental information to influence numerous aspects of plant growth and development throughout their life cycle.
Abstract:
The goal of the present review is to explain how immersive virtual environment technology (IVET) can be used for the study of social interactions and how the use of virtual humans in immersive virtual environments can advance research and application in many different fields. Researchers studying individual differences in social interactions are typically interested in keeping the behavior and the appearance of the interaction partner constant across participants. With IVET researchers have full control over the interaction partners, and can standardize them while still keeping the simulation realistic. Virtual simulations are valid: growing evidence shows that indeed studies conducted with IVET can replicate some well-known findings of social psychology. Moreover, IVET allows researchers to subtly manipulate characteristics of the environment (e.g., visual cues to prime participants) or of the social partner (e.g., his/her race) to investigate their influences on participants' behavior and cognition. Furthermore, manipulations that would be difficult or impossible in real life (e.g., changing participants' height) can be easily obtained with IVET. Beside the advantages for theoretical research, we explore the most recent training and clinical applications of IVET, its integration with other technologies (e.g., social sensing) and future challenges for researchers (e.g., making the communication between virtual humans and participants smoother).
Abstract:
This thesis studies role-based access control and its suitability in the enterprise environment. The aim is to investigate how extensively role-based access control can be implemented in the case organization and how it supports the organization's business and IT functions. The study points out the enterprise's needs for access control, the factors of access control in the enterprise environment, the requirements for implementation, and the benefits and challenges it brings along. To determine how extensively role-based access control can be implemented in the case organization, the actual state of access control is first examined. Secondly, a rudimentary desired state (how things should be) is defined, and thirdly it is completed using the results of the implementation of a role-based access control application. The study results in a role model for the case organization unit, and in the building blocks and framework for an organization-wide implementation. The ultimate value for the organization is delivered by facilitating its normal operations while protecting its information assets.
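The core idea of role-based access control, users obtaining permissions only through roles rather than directly, can be sketched minimally as follows. The role and permission names are hypothetical illustrations, not taken from the case organization or the thesis.

```python
# Minimal role-based access control sketch: users acquire permissions
# only through the roles assigned to them, never directly.

class RBAC:
    def __init__(self):
        self.role_permissions = {}   # role -> set of permissions
        self.user_roles = {}         # user -> set of roles

    def grant(self, role, permission):
        self.role_permissions.setdefault(role, set()).add(permission)

    def assign(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def check(self, user, permission):
        # A user holds a permission if any of their roles grants it.
        return any(permission in self.role_permissions.get(role, set())
                   for role in self.user_roles.get(user, ()))

rbac = RBAC()
rbac.grant("hr_clerk", "read_payroll")      # hypothetical role/permission
rbac.grant("hr_manager", "approve_payroll")
rbac.assign("alice", "hr_clerk")

print(rbac.check("alice", "read_payroll"))     # True
print(rbac.check("alice", "approve_payroll"))  # False
```

The indirection through roles is what makes organization-wide administration tractable: changing a role's permissions updates every user holding that role at once.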
Abstract:
The objective of the thesis is to enhance the understanding of the management of the front end phases of the innovation process in a networked environment. The thesis approaches the front end of innovation from three perspectives: the strategy, processes and systems of innovation. The purpose of using different perspectives is to provide an extensive systemic view of the front end and to uncover the complex nature of innovation management. The context of the research is the networked operating environment of firms. The unit of analysis is the firm itself or its innovation processes, which means that this research approaches innovation networks from the point of view of a firm. The strategy perspective of the thesis emphasises the importance of purposeful innovation management, the innovation strategy of firms. The role of innovation processes is critical in carrying out innovation strategies in practice, supporting the development of organizational routines for innovation, and driving the strategic renewal of companies. From the systems perspective, the primary focus of the thesis is on idea management systems, which are defined as a part of innovation management systems and, for this thesis, as any working combination of methodology and tools (manual or IT-supported) that enhances the management of innovations within their early phases. The main contributions of the thesis are the managerial frameworks developed for managing the front end of innovation, which purposefully "wire" the front end of innovation into the strategy and business processes of a firm. The thesis contributes to modern innovation management by connecting the internal and external collaboration networks as foundational elements for successful management of the early phases of innovation processes in a dynamic environment.
The innovation capability of a firm is largely defined by its ability to rely on and make use of internal and external collaboration already during the front end activities, which by definition include opportunity identification and analysis, idea generation, proliferation and selection, and concept definition. More specifically, coordination of the interfaces between these activities, and between the internal and external innovation environments of a firm, is emphasised. The role of information systems, in particular idea management systems, is to support and delineate the innovation-oriented behaviour and interaction of individuals and organizations during front end activities. The findings and frameworks developed in the thesis can be used by companies for purposeful promotion of their front end processes. The thesis provides a systemic strategy framework for managing the front end of innovation – not as a separate process, but as an elemental bundle of activities that is closely linked to the overall innovation process and strategy of a firm in a distributed environment. The theoretical contribution of the thesis relies on the advancement of the open innovation paradigm in the strategic context of a firm within its internal and external innovation environments. The thesis applies the constructive research approach and case study methodology to provide theoretically significant results, which are also practically beneficial.
Abstract:
The identifiability of the parameters of a heat exchanger model without phase change was studied in this Master's thesis using synthetically generated data. A fast, two-step Markov chain Monte Carlo (MCMC) method was tested on a couple of case studies and on a heat exchanger model. The two-step MCMC method worked well and decreased the computation time compared to the traditional MCMC method. The effect of the measurement accuracy of certain control variables on the identifiability of the parameters was also studied; the accuracy used did not seem to have a remarkable effect on identifiability. The reuse of the posterior distribution of the parameters across different heat exchanger geometries was also studied, since it would be computationally most efficient to use the same posterior distribution across geometries when optimising heat exchanger networks. According to the results, this is possible when the frontal surface areas are the same across geometries. In the other cases the same posterior distribution can still be used for optimisation, but it will give a wider predictive distribution as a result. For condensing surface heat exchangers the numerical stability of the simulation model was studied, and a stable algorithm was developed as a result.
Abstract:
Today's software industry faces increasingly complicated challenges in a world where software is nearly ubiquitous in our daily lives. Consumers want products that are reliable, innovative and rich in functionality, but at the same time affordable. The challenge for us in the IT industry is to create more complex, innovative solutions at a lower cost. This is one of the reasons why process improvement as a research area has not diminished in importance. IT professionals ask themselves: "How do we keep our promises to our customers while minimizing our risk and increasing our quality and productivity?" Within the field of process improvement there are different approaches. Traditional software process improvement methods such as CMMI and SPICE focus on the quality and risk aspects of the improvement process. More lightweight approaches, such as agile methods and Lean methods, focus on keeping promises and improving productivity by minimizing waste in the development process. The research presented in this thesis was carried out with a specific goal in mind: to improve the cost-effectiveness of working methods without compromising quality. That challenge was attacked from three different angles. First, working methods are improved by introducing agile methods. Second, quality is maintained by using product-level measurement methods. Third, knowledge dissemination within large companies is improved through methods that put collaboration at the centre. The agile movement emerged during the 1990s as a reaction to the unrealistic demands that the previously dominant waterfall method placed on the IT industry. Software development is a creative process and differs from other industries in that the bulk of the daily work consists of creating something new that has not existed before.
Every software developer must be an expert in his or her field and spends a large part of the working day creating solutions to problems that he or she has never solved before. Although this has been a well-known fact for decades, many software projects are still managed as if they were production lines in factories. One of the goals of the agile movement is to highlight precisely this discrepancy between the innermost nature of software development and the way software projects are managed. Agile working methods have proven to work well in the contexts they were created for, i.e. small, co-located teams working in close collaboration with a committed customer. In other contexts, and especially in large, geographically distributed companies, introducing agile methods is more challenging. We have approached this challenge by introducing agile methods through pilot projects. This has two clear advantages. First, knowledge about the methods and their interplay with the context in question can be gathered incrementally, which makes it easier to develop and adapt the methods to the specific demands of that context. Second, resistance to change can be overcome more easily by introducing cultural changes carefully and by giving the target group direct first-hand contact with the new methods. Relevant product measurement methods can help software development teams improve their working methods. For teams working with agile and Lean methods, a good set of metrics can be decisive for decision-making when prioritizing the backlog of tasks to be done. Our focus has been on supporting agile and Lean teams with internal product metrics for decision support regarding refactoring, i.e. the continuous quality improvement of a program's code and design.
The decision to refactor can be difficult to take, especially for agile and Lean teams, since they are expected to justify their priorities in terms of business value. We propose a way of measuring the design quality of systems developed using the model-driven paradigm, and we also construct a way of integrating this metric into agile and Lean working methods. An important part of every process improvement initiative is spreading knowledge about the new software process. This applies regardless of what kind of process one is trying to introduce, whether it is plan-driven or agile. We propose that methods based on collaboration in creating and further developing the process are a good way of supporting knowledge dissemination, and we give an overview of process authoring tools on the market with that proposal in mind.