941 results for Algorithms, Properties, the KCube Graphs


Relevance: 100.00%

Abstract:

From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process, since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process of the initial solution to randomly generate different alternative initial solutions of similar quality, which is attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve the top ILS-based metaheuristic by simply incorporating into it our biased randomization process with a high-quality pseudo-random number generator.
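
As an illustration of the ILS skeleton this abstract describes (not the paper's exact ESP rules: the insertion local search, two-job reinsertion perturbation, and accept-if-no-worse rule below are generic stand-ins, and the paper starts from a biased-randomized NEH solution rather than the identity permutation used here):

```python
import random

def makespan(seq, p):
    """Cmax of a job permutation; p[job][machine] = processing time."""
    m = len(p[0])
    c = [0] * m                        # completion time on each machine
    for j in seq:
        c[0] += p[j][0]
        for i in range(1, m):
            c[i] = max(c[i], c[i - 1]) + p[j][i]
    return c[-1]

def local_search(seq, p):
    """First-improvement insertion neighbourhood (generic stand-in)."""
    improved = True
    while improved:
        improved = False
        best = makespan(seq, p)
        for j in range(len(seq)):
            for k in range(len(seq)):
                if k == j:
                    continue
                cand = seq[:j] + seq[j + 1:]
                cand.insert(k, seq[j])
                if makespan(cand, p) < best:
                    seq, best, improved = cand, makespan(cand, p), True
    return seq

def ils(p, iterations=200, seed=42):
    rng = random.Random(seed)
    cur = local_search(list(range(len(p))), p)   # paper: biased-randomized NEH start
    best = cur[:]
    for _ in range(iterations):
        pert = cur[:]
        for _ in range(2):             # perturbation: reinsert two random jobs
            job = pert.pop(rng.randrange(len(pert)))
            pert.insert(rng.randrange(len(pert) + 1), job)
        cand = local_search(pert, p)
        if makespan(cand, p) <= makespan(cur, p):    # accept if no worse
            cur = cand
            if makespan(cur, p) < makespan(best, p):
                best = cur[:]
    return best, makespan(best, p)
```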

Relevance: 100.00%

Abstract:

Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a mere informative role, supplying analysts with formatted and summarized data which they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality.

A firm considering using a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance, and say nothing as to the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent necessity of fixing some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least with no consensus on their validity.

How then to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process. We take care to emphasize that we want to prove the said model is the cause of performance, and to compare against an (incumbent) process rather than against an alternate model.

In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms control a set of flights during adjacent weeks, and their behavior and results are observed over a relatively long period of time (9 months). In parallel, a group of control flights were managed using the traditional mix of manual and algorithmic control (incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an undisputable model of customer behavior, but the experimental design and analysis of results is less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, the design and setup of the experiment, and outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.
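
Purely as an illustration of the experimental design sketched above (the week-parity assignment rule, revenue figures, and the raw difference in means are hypothetical simplifications; the paper's econometric analysis is far more careful):

```python
from statistics import mean

def assign_arm(week_index):
    """Hypothetical scheme: alternate control of a flight between the
    candidate algorithm and the incumbent system in adjacent weeks."""
    return "algorithm" if week_index % 2 == 0 else "incumbent"

def revenue_lift(observations):
    """observations: (arm, revenue) pairs accumulated over the test period.
    Returns the raw difference in mean revenue between the two arms."""
    algo = [r for arm, r in observations if arm == "algorithm"]
    ctrl = [r for arm, r in observations if arm == "incumbent"]
    return mean(algo) - mean(ctrl)

# Hypothetical usage over four observed flight-weeks:
obs = [(assign_arm(w), rev) for w, rev in
       enumerate([10500.0, 9800.0, 11200.0, 10100.0])]
print(revenue_lift(obs))    # > 0 means the candidate algorithm earned more
```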

Relevance: 100.00%

Abstract:

We study the complexity of rationalizing choice behavior. We do so by analyzing two polar cases, and a number of intermediate ones. In our most structured case, where choice behavior is defined on universal choice domains and satisfies the "weak axiom of revealed preference," finding the complete preorder rationalizing choice behavior is a simple matter. In the polar case, where no restriction whatsoever is imposed, either on choice behavior or on the choice domain, finding the complete preorders that rationalize behavior turns out to be intractable. We show that the task of finding the rationalizing complete preorders is equivalent to a graph problem. This allows existing algorithms from the graph theory literature to be applied to the rationalization of choice.
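
As an illustration of the graph view, here is a sketch of one natural construction: draw an edge x -> y whenever x is chosen from a menu containing y, read a candidate complete preorder off the strongly connected components of this digraph, and check it against the data. This is a simple consistency check, not the paper's equivalence or hardness construction:

```python
from collections import defaultdict

def rationalize(observations):
    """observations: list of (menu, chosen) pairs of sets.
    Returns a rank dict (a candidate complete preorder, higher = better)
    if the candidate is consistent with the data, else None."""
    edges, items = defaultdict(set), set()
    for menu, chosen in observations:
        items |= set(menu)
        for x in chosen:                      # x revealed weakly preferred
            edges[x] |= set(menu) - {x}

    seen, order = set(), []
    def dfs(u, adj, out):
        """Iterative DFS appending vertices in post-order."""
        seen.add(u)
        stack = [(u, iter(adj[u]))]
        while stack:
            node, it = stack[-1]
            nxt = next((v for v in it if v not in seen), None)
            if nxt is None:
                stack.pop(); out.append(node)
            else:
                seen.add(nxt); stack.append((nxt, iter(adj[nxt])))

    for u in items:                           # Kosaraju pass 1
        if u not in seen:
            dfs(u, edges, order)
    redges = defaultdict(set)
    for u in items:
        for v in edges[u]:
            redges[v].add(u)
    seen, rank, level = set(), {}, 0
    for u in reversed(order):                 # Kosaraju pass 2: SCCs in
        if u not in seen:                     # topological order (best first)
            comp = []
            dfs(u, redges, comp)
            for v in comp:
                rank[v] = level               # alternatives in one SCC tie
            level -= 1

    for menu, chosen in observations:         # chosen must be exactly the
        top = max(rank[z] for z in menu)      # rank-maximal menu elements
        if any(rank[x] != top for x in chosen):
            return None
        if any(rank[y] == top for y in menu if y not in chosen):
            return None
    return rank
```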

Relevance: 100.00%

Abstract:

We present some results attained with different algorithms for the Fm|block|Cmax problem, using the well-known Taillard instances as experimental data.
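
For context, Fm|block|Cmax differs from the standard flowshop in that a finished job blocks its machine until the next machine is free. A minimal sketch of the standard departure-time recursion for evaluating one sequence (the toy processing-time matrix below merely stands in for a Taillard instance):

```python
def blocking_makespan(seq, p):
    """Cmax under blocking constraints; p[job][machine] = processing time.
    d[i] holds the departure time of the previously sequenced job from
    machine i; a job cannot leave machine i before machine i+1 is free."""
    m = len(p[0])
    d = [0] * m
    for j in seq:
        new = [0] * m
        t = d[0]                # machine 0 is free once the prior job departs
        for i in range(m):
            finish = t + p[j][i]
            # departure: wait (blocked) until the next machine is cleared
            new[i] = max(finish, d[i + 1]) if i < m - 1 else finish
            t = new[i]          # the job enters machine i+1 at its departure
        d = new
    return d[-1]

# Toy usage with a 3-job, 2-machine instance (not a Taillard instance):
p = [[3, 2], [1, 4], [2, 1]]
print(blocking_makespan([0, 1, 2], p))   # -> 10
```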

Relevance: 100.00%

Abstract:

The right of a person to be protected from natural hazards is a characteristic of the social and economic development of society. This paper is a contribution to the reflection on the role of Civil Protection organizations in a modern society. The paper is based on the inaugural lecture given by the authors at the 9th Plinius Conference on Mediterranean Storms. Two major issues are considered. The first is sociological: the Civil Protection organizations and the administration responsible for land use planning should be perceived as being as reliable as possible, in order to reach consensus on the restrictions they impose, temporarily or permanently, on the free individual use of the territory, as well as on the entire warning system. The second is technological: in order to be reliable, they have to issue timely alerts and warnings to the population at large, but such alarms should be as "true" as possible. With this aim, the paper summarizes the historical evolution of risk assessment, starting from the original concept of "hazard", introducing the concepts of "scenario of event" and "scenario of risk", and ending with a discussion about the uncertainties and limits of the most advanced and efficient tools to predict, forecast, and observe the ground effects affecting people and their properties. The discussion is centred on the case of heavy rains and flood events in the north-west of the Mediterranean Region.

Relevance: 100.00%

Abstract:

ABSTRACT: BACKGROUND: Serologic testing algorithms for recent HIV seroconversion (STARHS) provide important information for HIV surveillance. We have shown that a patient's antibody reaction in a confirmatory line immunoassay (INNO-LIA™ HIV I/II Score, Innogenetics) provides information on the duration of infection. Here, we sought to further investigate the diagnostic specificity of various Inno-Lia algorithms and to identify factors affecting it. METHODS: Plasma samples of 714 selected patients of the Swiss HIV Cohort Study, infected for longer than 12 months and representing all viral clades and stages of chronic HIV-1 infection, were tested blindly by Inno-Lia and classified as either incident (up to 12 months) or older infection by 24 different algorithms. Of the total, 524 patients received HAART, 308 had HIV-1 RNA below 50 copies/mL, and 620 were infected by an HIV-1 non-B clade. Using logistic regression analysis, we evaluated factors that might affect the specificity of these algorithms. RESULTS: HIV-1 RNA <50 copies/mL was associated with significantly lower reactivity to all five HIV-1 antigens of the Inno-Lia and impaired the specificity of most algorithms. Among 412 patients either untreated or with HIV-1 RNA ≥50 copies/mL despite HAART, the median specificity of the algorithms was 96.5% (range 92.0-100%). The only factor that significantly promoted false-incident results in this group was age, with false-incident results increasing by a few percent per additional year. HIV-1 clade, HIV-1 RNA, CD4 percentage, sex, disease stage, and testing modalities had no significant effect. Results were similar among the 190 untreated patients. CONCLUSIONS: The specificity of most Inno-Lia algorithms was high and not affected by HIV-1 variability, advanced disease, and other factors promoting false-recent results in other STARHS. Specificity should be good in any group of untreated HIV-1 patients.

Relevance: 100.00%

Abstract:

The clathrin assembly lymphoid myeloid leukemia (CALM) gene encodes a putative homologue of the clathrin assembly synaptic protein AP180. Hence, the biochemical properties, the subcellular localization, and the role in endocytosis of the CALM protein were studied. In vitro binding and coimmunoprecipitation demonstrated that the clathrin heavy chain is the major binding partner of CALM. The bulk of cellular CALM was associated with the membrane fractions of the cell and localized to clathrin-coated areas of the plasma membrane. In the membrane fraction, CALM was present in near-stoichiometric amounts relative to clathrin. To perform structure-function analysis of CALM, we engineered chimeric fusion proteins of CALM and its fragments with the green fluorescent protein (GFP). GFP-CALM was targeted to coated pits of the plasma membrane and was also found colocalized with clathrin in the Golgi area. High levels of expression of GFP-CALM or its fragments with clathrin-binding activity inhibited the endocytosis of transferrin and epidermal growth factor receptors and altered the steady-state distribution of the mannose-6-phosphate receptor in the cell. In addition, GFP-CALM overexpression caused the loss of clathrin accumulation in the trans-Golgi network area, whereas the localization of the clathrin adaptor protein complex 1 in the trans-Golgi network remained unaffected. The ability of the GFP-tagged fragments of CALM to affect clathrin-mediated processes correlated with the targeting of the fragments to clathrin-coated areas and their clathrin-binding capacities. Clathrin-CALM interaction seems to be regulated by multiple contact interfaces. The C-terminal part of CALM binds the clathrin heavy chain, although the full-length protein exhibited maximal ability for interaction. Altogether, the data suggest that CALM is an important component of the coated pit internalization machinery, possibly involved in the regulation of clathrin recruitment to the membrane and/or the formation of the coated pit.

Relevance: 100.00%

Abstract:

The complete Raman spectrum of SnO2 nanoparticles is presented and analyzed. In addition to the "classical" modes observed in the rutile structure, two other regions show Raman activity for nanoparticles. The Raman bands in the low-frequency region are attributed to acoustic modes associated with the vibration of the individual nanoparticle as a whole. The high-frequency region is activated by surface disorder. A detailed analysis of these regions and the changes in the normal modes of SnO2 are presented as a function of nanoparticle size.

Relevance: 100.00%

Abstract:

The coverage and volume of geo-referenced datasets are extensive and incessantly growing. The systematic capture of geo-referenced information generates large volumes of spatio-temporal data to be analyzed. Clustering and visualization play a key role in the exploratory data analysis and the extraction of knowledge embedded in these data. However, new challenges in visualization and clustering are posed when dealing with the special characteristics of this data: for instance, its complex structures, large quantity of samples, variables involved in a temporal context, high dimensionality, and large variability in cluster shapes.

The central aim of my thesis is to propose new algorithms and methodologies for clustering and visualization, in order to assist the knowledge extraction from spatio-temporal geo-referenced data, thus improving decision-making processes.

I present two original algorithms, one for clustering, the Fuzzy Growing Hierarchical Self-Organizing Networks (FGHSON), and the second for exploratory visual data analysis, the Tree-structured Self-organizing Maps Component Planes. In addition, I present methodologies that, combined with FGHSON and the Tree-structured SOM Component Planes, allow the integration of space and time seamlessly and simultaneously in order to extract knowledge embedded in a temporal context.

The originality of the FGHSON lies in its capability to reflect the underlying structure of a dataset in a hierarchical fuzzy way. A hierarchical fuzzy representation of clusters is crucial when data include complex structures with large variability of cluster shapes, variances, densities, and numbers of clusters. The most important characteristics of the FGHSON include: (1) it does not require an a priori setup of the number of clusters; (2) the algorithm executes several self-organizing processes in parallel, hence, when dealing with large datasets, the processes can be distributed, reducing the computational cost; and (3) only three parameters are necessary to set up the algorithm. In the case of the Tree-structured SOM Component Planes, the novelty lies in its ability to create a structure that allows the visual exploratory data analysis of large high-dimensional datasets. This algorithm creates a hierarchical structure of Self-Organizing Map Component Planes, arranging similar variables' projections in the same branches of the tree. Hence, similarities in variables' behavior can be easily detected (e.g. local correlations, maximal and minimal values, and outliers).

Both FGHSON and the Tree-structured SOM Component Planes were applied to several agroecological problems, proving to be very efficient in the exploratory analysis and clustering of spatio-temporal datasets.

In this thesis I also tested three soft competitive learning algorithms: two well-known unsupervised soft competitive algorithms, namely the Self-Organizing Maps (SOMs) and the Growing Hierarchical Self-Organizing Maps (GHSOMs), and a third that is our original contribution, the FGHSON. Although the algorithms presented here have been used in several areas, to my knowledge there is no work applying and comparing the performance of those techniques when dealing with spatio-temporal geospatial data, as is presented in this thesis.

I propose original methodologies to explore spatio-temporal geo-referenced datasets through time. Our approach uses time windows to capture temporal similarities and variations by using the FGHSON clustering algorithm. The developed methodologies are used in two case studies. In the first, the objective was to find similar agroecozones through time; in the second, it was to find similar environmental patterns shifted in time.

Several results presented in this thesis have led to new contributions to agroecological knowledge, for instance in sugar cane and blackberry production. Finally, in the framework of this thesis we developed several software tools: (1) a Matlab toolbox that implements the FGHSON algorithm, and (2) a program called BIS (Bio-inspired Identification of Similar agroecozones), an interactive graphical user interface tool which integrates the FGHSON algorithm with Google Earth in order to show zones with similar agroecological characteristics.
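
FGHSON itself is the thesis's original contribution and is not reproduced here; as a reference point for the soft competitive learning family it belongs to, a minimal online SOM training loop (grid size, decay schedules, and rates below are illustrative choices, not the thesis's settings):

```python
import numpy as np

def train_som(data, rows=6, cols=6, epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Online SOM: each sample pulls its best-matching unit (BMU) and,
    more weakly, the BMU's grid neighbours toward it."""
    rng = np.random.default_rng(seed)
    dim = data.shape[1]
    w = rng.normal(size=(rows, cols, dim))           # codebook vectors
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                indexing="ij"), axis=-1).astype(float)
    t, t_max = 0, epochs * len(data)
    for _ in range(epochs):
        for x in rng.permutation(data):
            frac = t / t_max
            lr = lr0 * (1.0 - frac)                  # decaying learning rate
            sigma = sigma0 * (1.0 - frac) + 0.5      # shrinking neighbourhood
            d = np.linalg.norm(w - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), d.shape)
            g = np.exp(-np.sum((grid - np.array(bmu, float)) ** 2, axis=-1)
                       / (2.0 * sigma ** 2))         # neighbourhood kernel
            w += lr * g[..., None] * (x - w)
            t += 1
    return w
```

Each of the resulting codebook's per-variable slices (`w[:, :, k]`) is one component plane, the object the Tree-structured SOM Component Planes method arranges hierarchically.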

Relevance: 100.00%

Abstract:

Due to human activity, large amounts of organic residue are generated daily. Their adequate use in agricultural activities therefore requires the characterization of their main properties. Chemical and physical characterization is important when planning the use and management of organic residue. In this study, chemical and physical properties of charcoal, coffee husk, pine bark, cattle manure, chicken manure, coconut fiber, sewage sludge, peat, and vermiculite were determined. The following properties were analyzed: N-NH4+, N-NO3-, and total concentrations of N, P, S, K, Ca, Mg, Mn, Zn, Cu, and B, as well as pH, electrical conductivity (EC), and bulk density. Coffee husk, sewage sludge, chicken manure, and cattle manure were generally richer in nutrients. The EC values of these residues were also the highest (0.08-40.6 dS m-1). Peat and sewage sludge had the highest bulk density. Sodium contents varied from 0 to 4.75 g kg-1, with the highest levels in chicken manure, cattle manure, and sewage sludge. Great care must therefore be taken when establishing the proportions of coffee husk, cattle manure, chicken manure, or sewage sludge in the production of substrates, and when calculating the fertilizer quantity these residues contribute in crop fertilization programs.

Relevance: 100.00%

Abstract:

The increase in agricultural production in the Brazilian Amazon region is mostly a result of the expansion of the agricultural frontier into areas previously influenced by humans or covered by native vegetation. At the same time, burning is still used to clear areas in small-scale agricultural systems, leading to a loss of the soil's productive capacity shortly after and forcing the opening of new areas. This study had the objective of evaluating the effect of soil preparation methods that involve plant residue shredding, left on the surface or incorporated into the soil, with or without chemical fertilization, on the soil's chemical and biological properties. The experiment was established in 1995 in an experimental field of Yellow Latosol (Oxisol) of Embrapa Amazônia Oriental, northeastern Pará (Brazil). The experiment was arranged in randomized blocks, in a 2x6 factorial design, with two management systems and six treatments, evaluated twice. The management systems consisted of rice (Oryza sativa), followed by cowpea (Vigna unguiculata) with manioc (Manihot esculenta). In the first system the crops were planted in two consecutive cycles, followed by a three-year fallow period (natural regrowth); the second system consisted of one cultivation cycle followed by a three-year fallow. The following treatments were applied to the secondary forest vegetation: slash and burn, fertilized with NPK (Q+NPK); slash and burn, without NPK fertilizer (Q-NPK); cutting and shredding, leaving the residues on the soil surface, fertilized with NPK (C+NPK); cutting and shredding, leaving residues on the soil surface, without fertilizer (C-NPK); cutting and shredding, with residue incorporation, fertilized with NPK (I+NPK); and cutting and shredding, with residue incorporation, without NPK fertilizer (I-NPK). The soil was sampled in the rainier season (April 2006) and in the drier season (September 2006), in the 0-0.1 m layer. From each plot, 10 single samples were collected to form one composite sample. In the more intensive management system the contents of microbial C (Cmic) and microbial N (Nmic) were higher, while the organic C (Corg) level was higher in the less intensive system. The treatments with the highest Cmic and Nmic levels were those with cutting, shredding, and distribution of biomass on the soil surface. Under both management systems, the chemical characteristics were in ranges that classify the soil as of low fertility, although P and K (in the rainy season) were higher in the less intensive management system.

Relevance: 100.00%

Abstract:

This report summarizes research conducted at Iowa State University on behalf of the Iowa Department of Transportation, focusing on the volumetric state of hot-mix asphalt (HMA) mixtures as they transition from stable to unstable configurations. This has traditionally been addressed during mix design by meeting a minimum voids in the mineral aggregate (VMA) requirement, based solely upon the nominal maximum aggregate size without regard to other significant aggregate-related properties. The goal was to expand the current specification to include additional aggregate properties, e.g., fineness modulus, percent crushed fine and coarse aggregate, and their interactions. The work was accomplished in three phases: a literature review, extensive laboratory testing, and statistical analysis of the test results. The literature review focused on the history and development of the current specification, laboratory methods of identifying critical mixtures, and the effects of other aggregate-related factors on critical mixtures. The laboratory testing involved three maximum aggregate sizes (19.0, 12.5, and 9.5 millimeters), three gradations (coarse, fine, and dense), and combinations of natural and manufactured coarse and fine aggregates. Specimens were compacted using the Superpave Gyratory Compactor (SGC), conventionally tested for bulk and maximum theoretical specific gravities, and physically tested using the Nottingham Asphalt Tester (NAT) under a repeated-load confined configuration to identify the transition state from sound to unsound. The statistical analysis used ANOVA and linear regression to examine the effects of the identified aggregate factors on critical state transitions in asphalt paving mixtures and to develop predictive equations. The results clearly demonstrate that the volumetric conditions of an HMA mixture at the stable-unstable threshold are influenced by a composite measure of the maximum aggregate size and gradation and by aggregate shape and texture. The currently defined VMA criterion, while significant, is seen to be insufficient by itself to correctly differentiate sound from unsound mixtures. Under current specifications, many otherwise sound mixtures are subject to rejection solely on the basis of failing to meet the VMA requirement. Based on the laboratory data and statistical analysis, a new paradigm for volumetric mix design is proposed that explicitly accounts for aggregate factors (gradation, shape, and texture).

Relevance: 100.00%

Abstract:

The research reported in this series of articles aimed at (1) automating the search for questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way. This latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited for different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography (HPTLC), despite its reputation of lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion and which is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
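
Purely as an illustration of the likelihood-ratio idea (the actual comparison algorithms and probabilistic model are developed in the Part I/II papers cited above), a sketch that scores two densitometric profiles by correlation and converts the score into a likelihood ratio using score populations from a reference collection; the correlation score and Gaussian fits are assumptions, not the papers' model:

```python
import numpy as np

def similarity(profile_a, profile_b):
    """Pearson correlation between two densitometric HPTLC profiles;
    a generic stand-in for the published comparison algorithms."""
    return float(np.corrcoef(np.asarray(profile_a),
                             np.asarray(profile_b))[0, 1])

def gaussian_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))

def likelihood_ratio(score, same_source_scores, different_source_scores):
    """LR = P(score | same ink) / P(score | different inks), with each
    density fitted as a Gaussian to scores computed on pairs drawn from
    a reference ink collection (an illustrative modelling choice)."""
    num = gaussian_pdf(score, np.mean(same_source_scores),
                       np.std(same_source_scores))
    den = gaussian_pdf(score, np.mean(different_source_scores),
                       np.std(different_source_scores))
    return num / den
```

An LR well above 1 supports the same-source proposition; an LR well below 1 supports different sources, which is what makes the reported evidential value transparent.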

Relevance: 100.00%

Abstract:

Leaching of nitrate (NO3-) can increase the groundwater concentration of this anion and reduce the agronomic effectiveness of nitrogen fertilizers. The main soil property inversely related to NO3- leaching is the anion exchange capacity (AEC), whose determination is, however, too time-consuming to be carried out in soil testing laboratories. For this reason, this study evaluated whether more easily measurable soil properties could be used to estimate the resistance of subsoils to NO3- leaching. Samples from the subsurface layer (20-40 cm) of 24 representative soils of São Paulo State were characterized for particle-size distribution and for chemical and electrochemical properties. The subsoil content of adsorbed NO3- was calculated from the difference between the NO3- contents extracted with 1 mol L-1 KCl and with water; furthermore, NO3- leaching was studied in miscible displacement experiments. The results of both the adsorption and leaching experiments were consistent with the well-known role exerted by AEC on nitrate behavior in weathered soils. Multiple regression analysis indicated that in subsoils with (i) low values of remaining phosphorus (Prem), (ii) low soil pH measured in water (pH H2O), and (iii) high pH measured in 1 mol L-1 KCl (pH KCl), the amounts of positive surface charges tend to be greater. For this reason, NO3- leaching tends to be slower in these subsoils, even under saturated flow conditions.
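
A minimal sketch of the kind of multiple regression reported here, relating adsorbed NO3- to Prem, pH H2O, and pH KCl; all numbers below are placeholders, not the study's measurements:

```python
import numpy as np

# Hypothetical predictors, one row per subsoil sample:
# remaining P (Prem, mg L-1), pH in water, pH in 1 mol L-1 KCl.
X = np.array([[12.0, 4.8, 4.2],
              [35.0, 5.6, 4.5],
              [22.0, 5.1, 4.9],
              [ 8.0, 4.5, 4.6],
              [28.0, 5.3, 4.4]])
y = np.array([3.1, 0.9, 2.0, 3.6, 1.3])   # adsorbed NO3- (placeholder)

A = np.column_stack([np.ones(len(X)), X])        # intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # ordinary least squares
print(dict(zip(["intercept", "Prem", "pH_H2O", "pH_KCl"], coef)))
# Per the abstract, adsorption (and hence resistance to leaching) should
# grow as Prem and pH(H2O) decrease and as pH(KCl) increases.
```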

Relevance: 100.00%

Abstract:

The application of sewage sludge is a concern because it may affect the quality of organic matter and the microbiological and biochemical properties of the soil. The effects of surface application of sewage sludge to an agricultural soil (at 18 and 36 t ha-1, dry basis) were assessed over one maize (Zea mays L.) growing season. The study evaluated microbial biomass, basal respiration, and selected enzymatic activities (catalase, urease, acid and alkaline phosphatase, and β-glucosidase) 230 days after sewage sludge application, and infrared spectroscopy was used to assess the quality of dissolved organic matter and humic acids. Sewage sludge applications increased the band intensities assigned to polysaccharide, carboxylic acid, amide, and lignin groups in the soil. The organic matter from the sewage sludge had a significant influence on the soil microbial biomass; nevertheless, by the end of the experiment the equilibrium of the soil microbial biomass (expressed as the microbial metabolic quotient, qCO2) was recovered. Soil urease, acid phosphatase, and alkaline phosphatase activities were strongly influenced by the sewage sludge applications.
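
The metabolic quotient mentioned above is conventionally computed as basal respiration per unit of microbial biomass carbon; a one-function sketch with placeholder units and values:

```python
def metabolic_quotient(basal_respiration, microbial_biomass_c):
    """qCO2: basal respiration (e.g. ug CO2-C g-1 soil h-1) per unit of
    microbial biomass C (e.g. mg Cmic g-1 soil). Units are illustrative."""
    return basal_respiration / microbial_biomass_c

# Placeholder values: an amended soil whose quotient returns to the
# control's level indicates the microbial biomass has re-equilibrated.
print(metabolic_quotient(2.4, 0.8), metabolic_quotient(1.5, 0.5))
```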