919 results for Many-to-many-assignment problem
Abstract:
Institutional digital repositories are a key element in providing preservation and reuse of learning resources. However, their creation and maintenance usually follow a top-down approach, limiting the search and reuse of learning resources. To avoid this problem we propose the use of web 2.0 functionalities. In this paper we present how tagging can be used to enhance the search and reusability functionalities of institutional learning repositories, as well as to promote their usage. The paper also describes the evaluation process performed in a pilot experience involving open educational resources.
Abstract:
In this paper we propose a method for computing JPEG quantization matrices for a given mean square error (MSE) or PSNR. We then employ our method to compute JPEG standard progressive operation mode definition scripts using a quantization approach, so a trial-and-error procedure is no longer needed to obtain a desired PSNR and/or definition script, reducing cost. Firstly, we establish a relationship between a Laplacian source and its uniform quantization error. We apply this model to the coefficients obtained in the discrete cosine transform stage of the JPEG standard. An image may then be compressed using the JPEG standard under a global MSE (or PSNR) constraint and a set of local constraints determined by the JPEG standard and visual criteria. Secondly, we study the JPEG standard progressive operation mode from a quantization-based approach. A relationship is found between the measured image quality at a given stage of the coding process and a quantization matrix. Thus, the definition script construction problem can be reduced to a quantization problem. Simulations show that our method generates better quantization matrices than the classical method based on scaling the JPEG default quantization matrix. The PSNR estimate usually has an error smaller than 1 dB, and this figure decreases for high PSNR values. Definition scripts may be generated that avoid an excessive number of stages and remove small stages that do not contribute a noticeable image quality improvement during decoding.
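The global MSE (or PSNR) constraint above rests on the standard relationship between the two measures for 8-bit images, PSNR = 10·log10(255²/MSE). A minimal sketch of that conversion (illustrative only, not the authors' quantization model; function names are ours):

```python
import math

def psnr_from_mse(mse, max_val=255.0):
    """PSNR in dB for a given mean square error (8-bit images by default)."""
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

def mse_from_psnr(psnr, max_val=255.0):
    """Target MSE for a desired PSNR; the inverse of psnr_from_mse."""
    return max_val ** 2 / (10.0 ** (psnr / 10.0))
```

A desired PSNR of 40 dB, for instance, corresponds to a target MSE of about 6.5, which would then drive the choice of quantization matrix.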
Abstract:
Annualising work hours (AH) is a means of achieving flexibility in the use of human resources to face the seasonal nature of demand. In Corominas et al. (1), two MILP models are used to solve the problem of planning staff working hours with an annual horizon. The costs due to overtime and to the employment of temporary workers are minimised, and the distribution of working time over the course of the year for each worker and the distribution of working time provided by temporary workers are regularised. In the aforementioned paper, the following is assumed: (i) the holiday weeks are fixed a priori and (ii) workers from different categories who are able to perform a specific type of task have the same efficiency; moreover, the values of the binary variables (and others) in the second model are fixed to those in the first model (thus, in the second model these intervene as constants and not as variables, resulting in an LP model). In the present paper, these assumptions are relaxed and a more general problem is solved. The computational experiment leads to the conclusion that MILP is a technique suited to dealing with the problem.
Abstract:
This paper presents a probabilistic approach to modelling the problem of power supply voltage fluctuations. Error probability calculations are shown for some 90-nm technology digital circuits. The analysis considered here gives the timing violation error probability as a new design quality factor, in contrast to conventional techniques that assume the full perfection of the circuit. The evaluation of the error bound can be useful for new design paradigms where retry and self-recovering techniques are being applied to the design of high performance processors. The method described here makes it possible to evaluate the performance of these techniques by calculating the expected error probability in terms of power supply distribution quality.
Abstract:
Quinupristin-dalfopristin (Q-D) is an injectable streptogramin active against most gram-positive pathogens, including methicillin-resistant Staphylococcus aureus (MRSA). In experimental endocarditis, however, Q-D was less efficacious against MRSA isolates constitutively resistant to macrolide-lincosamide-streptogramin B (C-MLS(B)) than against MLS(B)-susceptible isolates. To circumvent this problem, we used the checkerboard method to screen drug combinations that would increase the efficacy of Q-D against such bacteria. beta-Lactams consistently exhibited additive or synergistic activity with Q-D. Glycopeptides, quinolones, and aminoglycosides were indifferent. No drugs were antagonistic. The positive Q-D-beta-lactam interaction was independent of MLS(B) or beta-lactam resistance. Moreover, addition of Q-D at one-fourth the MIC to flucloxacillin-containing plates decreased the flucloxacillin MIC for MRSA from 500-1,000 mg/liter to 30-60 mg/liter. Yet, Q-D-beta-lactam combinations were not synergistic in bactericidal tests. Rats with aortic vegetations were infected with two C-MLS(B)-resistant MRSA isolates (AW7 and P8) and were treated for 3 or 5 days with drug dosages simulating the following treatments in humans: (i) Q-D at 7 mg/kg twice a day (b.i.d.) (a relatively low dosage purposely used to help detect positive drug interactions), (ii) cefamandole at constant levels in serum of 30 mg/liter, (iii) cefepime at 2 g b.i.d., and (iv) Q-D combined with either cefamandole or cefepime. Any of the drugs used alone resulted in treatment failure. In contrast, Q-D plus either cefamandole or cefepime significantly decreased valve infection compared to the levels of infection in both untreated controls and animals that received monotherapy (P < 0.05). Importantly, Q-D prevented the growth of highly beta-lactam-resistant MRSA in vivo. The mechanism of this beneficial drug interaction is unknown. However, Q-D-beta-lactam combinations might be useful for the treatment of complicated infections caused by multiple organisms, including MRSA.
Abstract:
MOTIVATION: Comparative analyses of gene expression data from different species have become an important component of the study of molecular evolution. Thus methods are needed to estimate evolutionary distances between expression profiles, as well as a neutral reference to estimate selective pressure. Divergence between expression profiles of homologous genes is often calculated with Pearson's or Euclidean distance. Neutral divergence is usually inferred from randomized data. Despite being widely used, neither of these two steps has been well studied. Here, we analyze these methods formally and on real data, highlight their limitations and propose improvements. RESULTS: It has been demonstrated that Pearson's distance, in contrast to Euclidean distance, leads to underestimation of the expression similarity between homologous genes with a conserved uniform pattern of expression. Here, we first extend this study to genes with conserved, but specific pattern of expression. Surprisingly, we find that both Pearson's and Euclidean distances used as a measure of expression similarity between genes depend on the expression specificity of those genes. We also show that the Euclidean distance depends strongly on data normalization. Next, we show that the randomization procedure that is widely used to estimate the rate of neutral evolution is biased when broadly expressed genes are abundant in the data. To overcome this problem, we propose a novel randomization procedure that is unbiased with respect to expression profiles present in the datasets. Applying our method to the mouse and human gene expression data suggests significant gene expression conservation between these species. CONTACT: marc.robinson-rechavi@unil.ch; sven.bergmann@unil.ch SUPPLEMENTARY INFORMATION: Supplementary data are available at Bioinformatics online.
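To make the contrast concrete, here is a minimal illustration of the two divergence measures discussed above (plain Python, not the paper's pipeline). Pearson's distance is invariant to the scale of a profile, while the Euclidean distance is not, which is one reason the latter depends strongly on data normalization:

```python
import math

def pearson_distance(x, y):
    """1 - Pearson correlation between two expression profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return 1.0 - cov / (sx * sy)

def euclidean_distance(x, y):
    """Plain Euclidean distance between two expression profiles."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

profile = [1.0, 2.0, 3.0, 4.0]
scaled = [2.0 * v for v in profile]  # same shape, different normalization
```

Here `pearson_distance(profile, scaled)` is 0 (the profiles are perfectly correlated), while `euclidean_distance(profile, scaled)` is strictly positive.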
Abstract:
At the beginning of the 21st century, some Catalan university libraries detected a need stemming from the lack of space and the repurposing of physical libraries within the new European educational panorama. With the same cooperative spirit that characterized previous CBUC (Consortium of Academic Libraries of Catalonia) programs and services, the Consortium set in motion a project to address this need. An initial study was commissioned in 2002, and in 2003 a suitable building (old infantry barracks) was found in Lleida. The official opening took place in 2008. The GEPA (Guaranteed Space for the Preservation of Access) facility is a cooperative repository whose objectives are to store and preserve low-use documents, ensuring their future access when needed, and to convert room for books into room for library users, saving both space and money. The paper presents a brief historical introduction to the physical management of collections in libraries and a short overview of high-density library repositories around the world as an answer to the pressing problem of lack of space. The main goals of the communication are to discuss the architectural project and its library-related issues, and to show how the GEPA facility made it possible to change the spaces in university libraries in Catalonia. On the one hand, the paper deals with the selection of an old building to be renovated, the determination of the library needs, the compact shelving system chosen to store the documents in the building, the relation between physical space and information management, and the logistics involved in moving low-use documents from the libraries into the facility. On the other hand, we show some examples of physical changes in Catalan libraries after large transfers of documents to GEPA.
Abstract:
A new approach to mammographic mass detection is presented in this paper. Although different algorithms have been proposed for such a task, most of them are application dependent. In contrast, our approach makes use of a kindred topic in computer vision adapted to our particular problem: we translate the eigenfaces approach for face detection/classification problems to mass detection. Two different databases were used to show the robustness of the approach. The first one consisted of a set of 160 regions of interest (RoIs) extracted from the MIAS database, 40 of them with confirmed masses and the rest normal tissue. The second set of RoIs was extracted from the DDSM database and contained 196 RoIs with masses and 392 with normal but suspicious regions. Initial results demonstrate the feasibility of using such an approach, with performance comparable to other algorithms and the advantage of being a more general, simpler and cost-effective approach.
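The eigenfaces idea mentioned above amounts to PCA on flattened image regions; a candidate region can then be scored by how well the learned subspace reconstructs it. A minimal NumPy sketch under that reading (function names are ours, not the paper's):

```python
import numpy as np

def eigenpatches(train, k):
    """PCA basis ('eigenfaces') from flattened training RoIs, one per row."""
    mean = train.mean(axis=0)
    centered = train - mean
    # Right singular vectors of the centered data are the principal components.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:k]

def reconstruction_error(roi, mean, basis):
    """Distance between a RoI and its projection onto the PCA subspace."""
    c = roi - mean
    proj = basis.T @ (basis @ c)
    return float(np.linalg.norm(c - proj))
```

RoIs resembling the training set reconstruct with low error, while dissimilar regions do not, which supports a simple threshold or nearest-neighbour classifier on the projected coefficients.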
Abstract:
This Master's thesis was commissioned by the audit committee of the City of Lappeenranta. The subject of the study is four committees of the City of Lappeenranta, and its purpose is to evaluate the work of these committees. The study examines whether the committees do the work they are supposed to do and are expected to do. The actual research problem is: how well do the expectations that the committees and the senior officials hold about the committees' work match? This is a qualitative case study. Thematic interviews are used as the research method, and the committee members are given a written, semi-structured questionnaire. The material is analysed thematically. The study concludes that the expectations of the senior officials, the committee chairs and the committee members regarding the committees' work do not always meet. The committee chair and the head of the administrative branch may hold very similar expectations, while the committee members may expect quite different things.
Abstract:
Analyzing the state of the art in a given field in order to tackle a new problem is always a mandatory task. The literature provides surveys based on summaries of previous studies, which are often based on theoretical descriptions of the methods. An engineer, however, requires some evidence from experimental evaluations in order to make the appropriate decision when selecting a technique for a problem. This is what we have done in this paper: experimentally analyzed a set of representative state-of-the-art techniques for the problem we are dealing with, namely, the road passenger transportation problem. This is an optimization problem in which drivers must be assigned to transport services, fulfilling some constraints and minimizing some cost function. The experimental results have provided us with good knowledge of the properties of several methods, such as modeling expressiveness, anytime behavior, computational time, memory requirements, parameters, and freely downloadable tools. Based on our experience, we are able to choose a technique to solve our problem. We hope that this analysis is also helpful for other engineers facing a similar problem.
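As a toy illustration of the kind of problem described above (not one of the surveyed techniques), assigning n drivers to n services at minimum total cost can be solved exhaustively for tiny instances; real instances require MILP or the other methods compared in the paper. The cost matrix here is invented:

```python
from itertools import permutations

# Hypothetical cost[d][s]: cost of driver d covering service s.
cost = [
    [4, 1, 3],
    [2, 0, 5],
    [3, 2, 2],
]

def best_assignment(cost):
    """Exhaustive search over one-to-one driver-to-service assignments."""
    n = len(cost)
    best, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[d][perm[d]] for d in range(n))
        if total < best:
            best, best_perm = total, perm
    return best, best_perm
```

For this matrix the optimum assigns driver 0 to service 1, driver 1 to service 0 and driver 2 to service 2, with total cost 5; the factorial search space is exactly why the constrained, large-scale version needs the techniques evaluated in the paper.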
Abstract:
The number of mergers and acquisitions has been historically high in the 2000s, even though almost half of them fail. Intangible factors, such as organisational cultures, play a central role in the success of acquisitions. The case company is also active in acquisitions and wants to evaluate the effectiveness of its integration process. The purpose of this Master's thesis is therefore to create a tool for assessing the compatibility of organisational cultures, so that acquisition decision-making and integration planning can be better supported in the company. The thesis answers questions such as how to assess cultural compatibility before integration in order to improve the integration process, and which cultural factors were the most problematic and, on the other hand, the most successful in the studied integration. The assessment of cultural compatibility should be seen as a process that forms part of the acquisition. The process should begin by defining the goals of cultural integration and by understanding the concept of organisational culture. The culture analysis should be carried out in a workshop and should cover at least nine cultural dimensions: innovativeness, decision-making, people orientation, communication, control, customer orientation, time management, identification, and collectivism. In addition, each question related to a dimension should be answered on a scale of one to five, so that a cultural-compatibility profile can be drawn. Afterwards, management should discuss the results once more in greater detail and finally compile them into a written report. In the studied integration, people orientation and time management (work-life balance and future orientation) supported the integration best. The most challenging cultural factors concerned decision-making, communication and control, which appear to be typical problems when a large company acquires a smaller one.
Abstract:
This work presents a formulation of contact with friction between elastic bodies. This is a nonlinear problem due to the unilateral constraints (non-penetration of the bodies) and to friction. The solution of this problem can be found using optimization concepts, modelling the problem as a constrained minimization problem. The Finite Element Method is used to construct the approximation spaces. The minimization problem has the total potential energy of the elastic bodies as the objective function; the non-penetration conditions are represented by inequality constraints, and equality constraints are used to deal with friction. Because there are two friction conditions (stick and slip), specific equality constraints are present or absent according to the current condition. Since the Coulomb friction condition depends on the normal and tangential contact stresses related to the constraints of the problem, a condition-dependent constrained minimization problem is devised. An Augmented Lagrangian Method for constrained minimization is employed to solve this problem. When applied to a contact problem, this method yields Lagrange multipliers with the physical meaning of contact forces, which makes it possible to check the friction condition at each iteration. These concepts make it possible to devise a computational scheme that leads to good numerical results.
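The stick/slip structure described above fits the generic augmented Lagrangian treatment of mixed equality/inequality constraints. The following is a textbook-style sketch under that reading, not the paper's exact functional: with $\Pi(u)$ the total potential energy, $g_i(u) \le 0$ the non-penetration constraints and $h_j(u) = 0$ the stick constraints,

```latex
L_r(u,\lambda,\mu) = \Pi(u)
  + \sum_j \Big( \mu_j\, h_j(u) + \tfrac{r}{2}\, h_j(u)^2 \Big)
  + \tfrac{1}{2r} \sum_i \Big( \max\{0,\ \lambda_i + r\, g_i(u)\}^2 - \lambda_i^2 \Big)
```

with penalty parameter $r > 0$ and multiplier updates $\lambda_i \leftarrow \max\{0,\ \lambda_i + r\, g_i(u)\}$, $\mu_j \leftarrow \mu_j + r\, h_j(u)$. At convergence the multipliers $\lambda$ and $\mu$ play the role of normal and tangential contact forces, which is what allows the Coulomb condition to be checked at each iteration.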
Abstract:
Conyza spp. are widely responsible for yield losses in agriculture due to their worldwide occurrence, resistance to herbicides and other traits which turn these species into first-grade weeds. Since the 1980s, these species have been cited in books on both ecology and weed science, usually classified as ruderals. The occurrence of Conyza in crops shows that these species are highly adaptable due to their recent evolutionary origin, and that they occur in environments subject simultaneously to moderate levels of competition, disturbance and stress. There are also limitations in Grime's theory which may lead to mistakes about the behavior of Conyza. Thus, simple and isolated recommendations will certainly not solve the problem of Conyza. Neither soil tillage nor 2,4-D-tolerant crops will free agriculture from this weed; an integrated approach is necessary, which demands qualified human resources in weed science and planning.
Abstract:
Parents of children with autism spectrum disorders (ASD) and developmental delays (DD) may experience more child problem behaviours, report lower parenting self-efficacy (PSE), and be more reactive than proactive in their parenting strategies than parents of children with typical development (TD). Differences in PSE and parenting strategies may also influence the extent to which child problem behaviours are experienced by parents of children with ASD and DD, compared to parents of children with TD. Using a convenience sample of parents of children with ASD (n = 48), DD (n = 51), and TD (n = 72), this study examined group differences on three key variables: PSE, parenting strategies, and child problem behaviour. Results indicated that the DD group scored lower than the ASD group on PSE in preventing child problem behaviour. The TD group used fewer reactive strategies than the DD group, and fewer proactive strategies than both the ASD and DD groups. For the overall sample, greater use of reactive strategies predicted higher ratings of child problem behaviour, while a greater proportion of proactive to reactive strategies predicted lower ratings. PSE was found to moderate the relationship between DD diagnosis and child problem behaviour. Implications for behavioural (i.e., parenting strategies) and cognitive (i.e., PSE) approaches to parenting are discussed.
Abstract:
The aim of this paper is to demonstrate that, even if Marx's solution to the transformation problem can be modified, his basic conclusions remain valid. The alternative solution proposed here is based on the constraint of a common general profit rate in both spaces and on a money wage level determined simultaneously with prices.