100 results for large infrastructure


Relevance: 20.00%

Abstract:

Viri is a system for the automatic distribution and execution of Python code on remote machines, which is especially useful when dealing with a large group of hosts. With Viri, sysadmins can write their own scripts and easily distribute and execute them on any number of remote machines. Depending on the number of computers to administer, Viri can save thousands of hours that sysadmins would otherwise spend transferring files, logging into remote hosts, and waiting for scripts to finish; Viri automates the whole process. Viri can also be useful for remotely managing host settings. It is designed to work together with an application in which information about the hosts is maintained; this information can include cron tasks, firewall rules, backup settings, and so on. After a simple integration of such an application with your Viri infrastructure, you can change any setting in the application and see it applied on the target host automatically.
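The manual workflow that Viri automates, logging in to each host and running a script, can be sketched with plain SSH fan-out. This is not Viri's own API; the host names and the script path below are invented for illustration:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Hypothetical host names; a real deployment would list reachable machines.
HOSTS = ["web01.example.org", "web02.example.org", "db01.example.org"]

def run_remote(host, script="/usr/local/bin/healthcheck.py"):
    """Run a Python script on one host over SSH (script path is made up)."""
    result = subprocess.run(
        ["ssh", host, "python3", script],
        capture_output=True, text=True, timeout=60,
    )
    return host, result.returncode, result.stdout

def run_everywhere(hosts, runner=run_remote):
    # Fan out to all hosts in parallel instead of logging in one by one.
    with ThreadPoolExecutor(max_workers=8) as pool:
        return list(pool.map(runner, hosts))
```

A tool like Viri replaces this ad-hoc loop with managed distribution, execution, and result collection.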

Relevance: 20.00%

Abstract:

The evaluation of large projects raises well-known difficulties because, by definition, such projects modify the current price system; their public evaluation presents additional difficulties because they also modify the shadow prices that would exist without the project. This paper first analyzes the basic methodologies applied until the late 1980s, based either on the integration of projects into optimization models or on iterative procedures with information exchange between two organizational levels. Newer methodologies are based on variational inequalities, bilevel programming, and linear or nonlinear complementarity. Their foundations and their different applications to project evaluation are explored. These new tools are in fact closely related to one another and can treat more complex cases involving, for example, the reaction of agents to policies, or the existence of multiple agents in an environment characterized by common functions representing demands or constraints on polluting emissions.
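The bilevel structure mentioned above can be illustrated with a toy two-level problem (all functions here are invented for illustration): a planner chooses a policy x, an agent reacts by solving its own inner optimization, and the planner evaluates its objective at the agent's best response.

```python
from scipy.optimize import minimize_scalar

def agent_response(x):
    # Lower-level problem: the agent minimizes its own cost given policy x.
    res = minimize_scalar(lambda y: (y - x) ** 2 + 0.1 * y ** 2)
    return res.x  # best response, here y*(x) = x / 1.1

def planner_cost(x):
    # Upper-level problem: the planner anticipates the agent's reaction.
    y = agent_response(x)
    return (x - 1.0) ** 2 + y ** 2

# Crude grid search over policies; real bilevel solvers are far more sophisticated.
best_x = min((i / 100 for i in range(-200, 201)), key=planner_cost)
```

The key feature, shared with the methodologies the paper surveys, is that the agent's optimality condition becomes a constraint of the planner's problem.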

Relevance: 20.00%

Abstract:

Over the last 30 years, the number of fires suffered in Galicia has increased considerably. In the Massís Central Ourensano, people have used fire as a forest management tool to allow livestock grazing and to recover land for pasture and crops. This practice has generated large expanses of dry European heath where forest formations should exist. We start from the need to recover areas of forest and of the original ecosystem, and we therefore measure how the different species found in the scrubland and in the forest at these altitudes respond to the disturbance posed by recurrent fires. To do this, the vegetation of areas burned at different times, or burned repeatedly, was sampled, measuring cover and height as well as the number of individuals of each species. Soil profiles were also analysed to characterize each zone in more detail. The sampling shows that the determining factor in the recovery of the scrubland is time: even without deep, good-quality soil, after an average of 8 years we find a well-developed scrubland with good species diversity and degree of cover. In contrast, to reach a stage of plant succession with forest, tree communities must exist nearby so that individuals can reach the developed scrubland. It is therefore necessary to work on educating the population and on seeking alternatives to current forest management, emphasizing the economic valuation of ecosystems in good condition and ensuring that this good condition benefits the local population. To that end, infrastructure should be created to attract environmentally respectful rural tourism, alongside other initiatives such as the installation of biomass plants in villages to provide heating or hot water.
This would generate jobs and savings for the population of an area whose economy is still largely based on livestock farming. At the same time, the money devoted to plantations should be directed toward creating areas with native species, such as oak, in those scrublands whose conditions are suitable for recovering the forest.

Relevance: 20.00%

Abstract:

Globalization involves several facility location problems that need to be handled at large scale. Location Allocation (LA) is a combinatorial problem in which the distances among points in the data space matter. Precisely because of this distance property of the domain, we exploit the capability of clustering techniques to partition the data space, converting an initial large LA problem into several simpler LA problems. In particular, our motivating problem involves a huge geographical area that can be partitioned under overall conditions. We present different types of clustering techniques and then perform a cluster analysis over our dataset in order to partition it. After that, we solve the LA problem by applying a simulated annealing algorithm to both the clustered and the non-clustered data, in order to work out how profitable the clustering is and which of the presented methods is the most suitable.
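A minimal sketch of the divide-and-conquer idea, under simplifying assumptions: k-means stands in for the paper's cluster analysis, the data are synthetic, and each cluster's sub-problem is a 1-median solved exhaustively rather than by simulated annealing.

```python
import numpy as np

rng = np.random.default_rng(42)
points = rng.uniform(0, 100, size=(300, 2))   # synthetic demand points

def kmeans(X, k, iters=50):
    # Plain Lloyd's k-means; stands in for the paper's cluster-analysis step.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(X[:, None] - centers, axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

def one_median(X):
    # Exhaustive 1-median: the point minimizing total distance to all others
    # (a metaheuristic such as simulated annealing replaces this at scale).
    D = np.linalg.norm(X[:, None] - X[None, :], axis=2)
    return X[np.argmin(D.sum(axis=1))]

labels = kmeans(points, k=4)
facilities = np.array([one_median(points[labels == j]) for j in range(4)])
```

Each sub-problem sees only its cluster's points, so the quadratic distance computations shrink dramatically compared to the unpartitioned problem.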

Relevance: 20.00%

Abstract:

Cobre Las Cruces is a renowned copper mining company located in Sevilla, with unexpected problems in wireless communications that have a direct impact on production. The main goals are therefore to improve the WiFi infrastructure, to secure it, and to detect and prevent attacks and the installation of rogue (non-authorized) APs, all of this integrated with the current ICT infrastructure. This project has been divided into four phases, although only two of them are included in the TFC: the analysis of the current situation and the design of a WLAN solution. Once the analysis was finished, several weaknesses were detected. Lack of connectivity and control, ignorance of the installed WiFi devices and of their location and state, and, by and large, the use of weak security mechanisms were some of the problems found. Additionally, because the working area grew larger and new WiFi infrastructures were added, the first phase took more time than expected. As a result of the detailed analysis, a set of goals was defined and a centralized approach able to meet them was designed. A solution based on the 802.11i and 802.1X protocols, digital certificates, a probe system running as an IDS/IPS, and lightweight APs in conjunction with a Wireless LAN Controller are its main features.
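On the client side, the 802.1X-with-certificates scheme described above corresponds to a WPA-EAP (EAP-TLS) profile. A hypothetical network block in the standard `wpa_supplicant` configuration format is shown below; the SSID, identity, and certificate paths are all invented, and the project's actual design is not reproduced here:

```conf
network={
    ssid="CLC-Corp"                       # hypothetical corporate SSID
    key_mgmt=WPA-EAP                      # 802.11i (WPA2-Enterprise)
    eap=TLS                               # certificate-based 802.1X
    identity="device01@clc.example"       # invented identity
    ca_cert="/etc/certs/ca.pem"           # CA that signed the RADIUS server cert
    client_cert="/etc/certs/device01.pem" # per-device client certificate
    private_key="/etc/certs/device01.key"
}
```

Because authentication is mutual (the client validates the server's certificate against the CA), this setup also helps defeat rogue APs impersonating the corporate SSID.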

Relevance: 20.00%

Abstract:

A general reduced-dimensionality finite field nuclear relaxation method for calculating vibrational nonlinear optical properties of molecules with large contributions due to anharmonic motions is introduced. In an initial application to the umbrella (inversion) motion of NH3, it is found that the difficulties associated with a conventional single-well treatment are overcome and that the particular definition of the inversion coordinate is not important. Future applications are described.

Relevance: 20.00%

Abstract:

A select-divide-and-conquer variational method to approximate configuration interaction (CI) is presented. Given an orthonormal set made up of occupied orbitals (Hartree-Fock or similar) and suitable correlation orbitals (natural or localized orbitals), a large N-electron target space S is split into subspaces S0, S1, S2, ..., SR. S0, of dimension d0, contains all configurations K with attributes (energy contributions, etc.) above thresholds T0 = {T0^egy, T0^etc.}; the CI coefficients in S0 always remain free to vary. S1 accommodates configurations K with attributes above T1 ≤ T0. An eigenproblem of dimension d0 + d1 for S0 + S1 is solved first, after which the last d1 rows and columns are contracted into a single row and column, thus freezing the last d1 CI coefficients thereafter. The process is repeated with successive Sj (j ≥ 2) chosen so that the corresponding CI matrices fit in random access memory (RAM). Davidson's eigensolver is used R times. The final energy eigenvalue (lowest or excited) always lies above the corresponding exact eigenvalue in S. The threshold values {Tj; j = 0, 1, 2, ..., R} regulate accuracy; for large-dimensional S, high accuracy requires S0 + S1 to be solved outside RAM. From there on, however, usually only a few Davidson iterations in RAM are needed for each step, so that Hamiltonian matrix-element evaluation becomes rate determining. One μhartree accuracy is achieved for an eigenproblem of order 24 × 10^6, involving 1.2 × 10^12 nonzero matrix elements and 8.4 × 10^9 Slater determinants.
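The solve-then-contract step can be illustrated on a toy symmetric matrix. This is a sketch under stated assumptions: the "Hamiltonian" is random rather than a real CI matrix, and dense diagonalization stands in for Davidson's method. The contracted eigenvalue stays variationally above the exact one:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d0, d1 = 12, 4, 4                       # toy spaces: S0, S1, and the rest
A = rng.normal(size=(n, n))
H = (A + A.T) / 2 - 10.0 * np.eye(n)       # symmetric toy "Hamiltonian"

E_exact = np.linalg.eigvalsh(H)[0]         # exact lowest eigenvalue in S

# Step 1: solve the d0+d1 eigenproblem for S0+S1 (dense here, Davidson in practice).
w, v = np.linalg.eigh(H[:d0 + d1, :d0 + d1])
c1 = v[d0:, 0] / np.linalg.norm(v[d0:, 0]) # S1 part of the lowest eigenvector

# Step 2: contract the d1 rows/columns of S1 into a single frozen row/column.
P = np.zeros((n, n - d1 + 1))
P[:d0, :d0] = np.eye(d0)                   # S0 coefficients stay free
P[d0:d0 + d1, d0] = c1                     # S1 frozen as one fixed combination
P[d0 + d1:, d0 + 1:] = np.eye(n - d0 - d1) # remaining space untouched
H_eff = P.T @ P @ np.eye(n - d1 + 1) * 0 + P.T @ H @ P  # contracted eigenproblem

E_approx = np.linalg.eigvalsh(H_eff)[0]
```

Because P has orthonormal columns, the contracted problem is a Rayleigh-Ritz projection, so its lowest eigenvalue is bounded below by the exact one and above by the S0+S1 value, mirroring the variational property stated in the abstract.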

Relevance: 20.00%

Abstract:

Many terrestrial and marine systems are experiencing accelerating decline due to the effects of global change. This situation has raised concern about the consequences of biodiversity losses for ecosystem function, ecosystem service provision, and human well-being. Coastal marine habitats are a main focus of attention because they harbour high biological diversity, are among the most productive systems of the world, and present high levels of anthropogenic interaction. The accelerating degradation of many terrestrial and marine systems highlights the urgent need to evaluate the consequences of biodiversity loss. Because marine biodiversity is a dynamic entity and this study was interested in global change impacts, it focused on benthic biodiversity trends over large spatial and long temporal scales. The main aim of this project was to investigate the current extent of biodiversity of the highly diverse benthic coralligenous community in the Mediterranean Sea, to detect its changes, and to predict its future changes over broad spatial and long temporal scales. These marine communities are characterized by structural species with low growth rates and long life spans; they are therefore considered particularly sensitive to disturbances. For this purpose, the project analyzed permanent photographic plots over time at four locations in the NW Mediterranean Sea. The spatial scale of this study provided information on the level of species similarity between these locations, thus offering a solid background on the amount of large-scale variability in coralligenous communities, whereas the temporal scale was fundamental for determining natural variability in order to discriminate between changes due to natural factors and those related to the impact of disturbances (e.g. mass mortality events related to positive thermal anomalies, or extreme catastrophic events).
This study directly addressed the challenging task of analyzing quantitative biodiversity data from these highly diverse marine benthic communities. Overall, the scientific knowledge gained through this research project will improve our understanding of the functioning of marine ecosystems and of their trajectories under global change.

Relevance: 20.00%

Abstract:

One of the first useful products from the human genome will be a set of predicted genes. Besides its intrinsic scientific interest, the accuracy and completeness of this data set is of considerable importance for human health and medicine. Though progress has been made on computational gene identification in terms of both methods and accuracy evaluation measures, most of the sequence sets on which the programs are tested are short genomic sequences, and there is concern that these accuracy measures may not extrapolate well to larger, more challenging data sets. Given the absence of experimentally verified large genomic data sets, we constructed a semiartificial test set comprising a number of short single-gene genomic sequences with randomly generated intergenic regions. This test set, which should still present an easier problem than real human genomic sequence, mimics the approximately 200 kb long BACs being sequenced. In our experiments with these longer genomic sequences, the accuracy of GENSCAN, one of the most accurate ab initio gene prediction programs, dropped significantly, although its sensitivity remained high. Conversely, the accuracy of similarity-based programs such as GENEWISE, PROCRUSTES, and BLASTX was not affected significantly by the presence of random intergenic sequence, but depended on the strength of the similarity to the protein homolog. As expected, the accuracy dropped when the models were built using more distant homologs, and we were able to quantitatively estimate this decline. However, the specificities of these techniques are still rather good even when the similarity is weak, which is a desirable characteristic for driving expensive follow-up experiments.
Our experiments suggest that though gene prediction will improve with every new protein that is discovered and through improvements in the current set of tools, we still have a long way to go before we can decipher the precise exonic structure of every gene in the human genome using purely computational methodology.
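The semiartificial construction described above amounts to padding single-gene sequences with random DNA while tracking gene coordinates for accuracy scoring. A minimal sketch (the toy gene sequences and spacer lengths are invented, and uniform base composition is a simplification of real intergenic DNA):

```python
import random

random.seed(1)

def random_dna(length):
    # Random intergenic spacer; real intergenic DNA is biased and repeat-rich.
    return "".join(random.choice("ACGT") for _ in range(length))

def build_test_sequence(genes, spacer_len=5000):
    # Concatenate single-gene sequences separated by random intergenic regions,
    # recording each gene's coordinates for later accuracy evaluation.
    parts, coords, pos = [], [], 0
    for g in genes:
        spacer = random_dna(spacer_len)
        parts.append(spacer)
        pos += len(spacer)
        coords.append((pos, pos + len(g)))
        parts.append(g)
        pos += len(g)
    parts.append(random_dna(spacer_len))   # trailing intergenic region
    return "".join(parts), coords

genes = ["ATGGCCATTGTAATGGGCCGCTGA", "ATGAGTCGTTGA"]  # toy "genes"
seq, coords = build_test_sequence(genes, spacer_len=100)
```

Predictions on `seq` can then be scored against `coords` exactly as on a short single-gene test set, but in a context that penalizes over-prediction in intergenic DNA.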

Relevance: 20.00%

Abstract:

This report presents a systematic empirical annotation of transcript products from 399 annotated protein-coding loci across the 1% of the human genome targeted by the Encyclopedia of DNA Elements (ENCODE) pilot project, using a combination of 5' rapid amplification of cDNA ends (RACE) and high-density resolution tiling arrays. We identified previously unannotated and often tissue- or cell-line-specific transcribed fragments (RACEfrags), both 5' distal to the annotated 5' terminus and internal to the annotated gene bounds, for the vast majority (81.5%) of the tested genes. Half of the distal RACEfrags span large segments of genomic sequence away from the main portion of the coding transcript and often overlap the upstream annotated gene(s). Notably, at least 20% of the resultant novel transcripts have changes in their open reading frames (ORFs), most of them fusing ORFs of adjacent transcripts. A significant fraction of distal RACEfrags show expression levels comparable to those of known exons of the same locus, suggesting that they are not part of very minor splice forms. These results have significant implications concerning (1) our current understanding of the architecture of protein-coding genes; (2) our views on the locations of regulatory regions in the genome; and (3) the interpretation of sequence polymorphisms mapping to regions hitherto considered "noncoding," ultimately relating to the identification of disease-related sequence alterations.

Relevance: 20.00%

Abstract:

This paper describes a pilot study centred on the technology-enhanced self-development of competences in lifelong learning education, carried out in the challenging context of the Association of Participants Àgora. The pilot study shows that the use of the TENCompetence infrastructure, in this case the Personal Development Planner (PDP) tool, provides various kinds of benefits for adult participants with low educational profiles who are traditionally excluded from innovative learning technologies and the knowledge society. The self-organized training supported by the PDP tool aims at allowing learners to create and control their own learning plans based on their interests and educational background, including informal and non-formal experiences. In this sense, the pilot participants had the opportunity to develop and improve their competences in English language (basic and advanced levels) and in ICT competence profiles, which are mostly related to functional and communicative skills. Besides, the use of PDP functionalities such as the self-assessment, planning, and self-regulating elements allowed the participants to develop reflective skills. The pilot results also provide indications for future developments in the field of technology support for self-organized learners. The paper introduces the context and the pilot scenario, describes the evaluation methodology applied, and discusses the most significant findings of the pilot study.

Relevance: 20.00%

Abstract:

This paper describes an experiment exploring the effects of the TENCompetence infrastructure for supporting lifelong competence development, which is now under development. This infrastructure provides structured, multi-level access to learning materials, based upon competences. People can follow their own learning path, supported by a listing of competences and their components, by competence development plans attached to competences, and by the possibility to mark elements as complete. We expected the PCM to have an effect on (1) participants' control of their own learning, and on their appreciation of (2) their learning route, (3) the learning resources, (4) their competence development, and (5) the possibilities for collaboration. In the experiment, 44 Bulgarian teachers followed a six-week distance learning course on a specific teaching methodology. Some of them used the TENCompetence infrastructure, while the others used an infrastructure that was similar except for the characterizing elements mentioned above. The results showed that in the experimental condition more people passed the final competence assessment, and people felt more in control of their own learning. No differences between the two groups were found in the amount and appreciation of collaboration or in further measures of competence development.

Relevance: 20.00%

Abstract:

The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data present a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms within a single environment that offers clinicians the tools to analyze and interpret patient data and to draw on knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data thanks to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers can seamlessly and securely access and work on data distributed across multiple sites, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.

Relevance: 20.00%

Abstract:

We study the minimum mean square error (MMSE) and the multiuser efficiency η of large dynamic multiple-access communication systems in which optimal multiuser detection is performed at the receiver, while the number and the identities of active users are allowed to change at each transmission time. The system dynamics are ruled by a Markov model describing the evolution of the channel occupancy, and a large-system analysis is performed as the number of observations grows large. Starting from the equivalent scalar channel and the fixed-point equation tying multiuser efficiency to MMSE, we extend it to the case of a dynamic channel, and derive lower and upper bounds for the MMSE (and thus for η as well) that hold in the limit of large signal-to-noise ratios and increasingly large observation time T.
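For reference, in the static, equal-power case the fixed-point equation tying the multiuser efficiency to the scalar-channel MMSE is commonly written as follows (with system load β and per-user signal-to-noise ratio denoted snr); the abstract's contribution is the extension of this relation to a dynamic, Markov-modeled channel, which is not reproduced here:

```latex
% eta: multiuser efficiency; beta: load; mmse(.): MMSE of the equivalent
% scalar channel at effective SNR eta * snr.
\frac{1}{\eta} \;=\; 1 \;+\; \beta\,\mathsf{snr}\;\mathrm{mmse}\!\left(\eta\,\mathsf{snr}\right)
```

The equation is implicit in η because degrading the effective SNR of every user raises each user's estimation error, which in turn degrades η further; the large-system limit makes this self-consistency exact.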

Relevance: 20.00%

Abstract:

Large law firms seem to prefer hourly fees over contingent fees. This paper provides a moral hazard explanation for this pattern of behavior. Contingent legal fees align the interests of the attorney with those of the client, but not necessarily with those of the partnership. We show that the choice of hourly fees is a solution to an agency problem with multiple principals, where the interests of one principal (the law firm) collide with the interests of the other principal (the client).