954 results for "Suppliers selection problem"


Relevance:

20.00%

Publisher:

Abstract:

The Earth's ecological systems are currently undergoing major changes due to human activities. A growing number of studies show that these changes affect natural and sexual selection, and thereby evolutionary processes. The aim of this work was to investigate the effects of environmental change on sexual selection, using increased eutrophication of the spawning grounds of the three-spined stickleback Gasterosteus aculeatus as a model system. Sexual selection is an important evolutionary force with consequences at the population and species levels (Chapter I). The parts of the thesis focus on the effects of eutrophication on mate detection, the use of visual and olfactory cues in mate choice, and the distribution of mating success among nest-building males. Chapters II and III simulate how turbidity caused by phytoplankton affects, through its effects on visibility, the rate at which potential mates are encountered. The results show that normal algal blooms in the Baltic Sea have a moderate effect on the finding of potential mates. This suggests that algal blooms are unlikely to reduce selective mating through increased search costs. Chapter IV shows that the three-spined stickleback changes the relative use of different cues as water turbidity increases: visual cues decrease in importance while olfactory cues increase in importance. At the same time, the use of olfactory cues is facilitated by changes in the chemical composition of the water as photosynthesis intensifies (Chapter V). Spawning in eutrophied waters can nevertheless be costly at both the individual and the population level, as parasitised males, which are probably poorly genetically adapted to their environment, manage to receive more eggs in their nests than healthier males of probably higher genetic quality (Chapter VI). Eutrophication thus affects mate choice and competition for mates by influencing the detection of potential mates, the evaluation of mates, and the distribution of mates within spawning grounds.
The consequences this may have for the evolution of sexually selected traits and for population dynamics and viability remain unclear, however. The thesis demonstrates the difficulty of predicting the consequences of environmental change for sexual selection and its effects at the individual and population levels.

Relevance:

20.00%

Publisher:

Abstract:

In this paper, the pattern classification problem in tool wear monitoring is solved using nature-inspired techniques, namely Genetic Programming (GP) and Ant-Miner (AM). The main advantage of GP and AM is their ability to learn the underlying data relationships and express them in the form of mathematical equations or simple rules. The knowledge extracted from the training data set using GP and AM takes the form of a Genetic Programming Classifier Expression (GPCE) and of rules, respectively. The GPCE and the AM-extracted rules are then applied to the data in the testing/validation set to obtain the classification accuracy. A major attraction of GP-evolved GPCEs and AM-based classification is the possibility of obtaining expert-system-like rules that users can subsequently apply directly in their own applications. The performance of data classification using GP and AM is as good as the classification accuracy obtained in the earlier study.
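To make the two knowledge representations concrete, here is a minimal Python sketch of what an extracted classifier might look like. The feature names, thresholds, and coefficients are invented for illustration; they are not the expressions or rules reported in the paper.

```python
def gpce(features):
    """A (hypothetical) GP-evolved classifier expression: the value of an
    arithmetic expression over tool-wear features decides the class."""
    f, v, a = features["feed"], features["speed"], features["vib"]
    return "worn" if (a * f - 0.02 * v + 0.5 * a) > 1.0 else "fresh"

def ant_miner_rule(features):
    """A (hypothetical) Ant-Miner style rule list: ordered IF-THEN rules
    followed by a default class."""
    if features["vib"] > 1.2 and features["feed"] > 0.3:
        return "worn"
    return "fresh"
```

Both forms are directly readable by a machinist, which is the "expert system like rules" attraction the abstract mentions.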

Relevance:

20.00%

Publisher:

Abstract:

Scratch assays are difficult to reproduce. Here we identify a previously overlooked source of variability which could partially explain this difficulty. We analyse a suite of scratch assays in which we vary the initial degree of confluence (initial cell density). Our results indicate that the rate of re-colonisation is very sensitive to the initial density. To quantify the relative roles of cell migration and proliferation, we calibrate the solution of the Fisher–Kolmogorov model to cell density profiles to provide estimates of the cell diffusivity, D, and the cell proliferation rate, λ. This procedure indicates that the estimates of D and λ are very sensitive to the initial density. This dependence suggests that the Fisher–Kolmogorov model does not accurately represent the details of the collective cell spreading process, since this model assumes that D and λ are constants that ought to be independent of the initial density. Since higher initial cell density leads to enhanced spreading, we also calibrate the solution of the Porous–Fisher model to the data as this model assumes that the cell flux is an increasing function of the cell density. Estimates of D and λ associated with the Porous–Fisher model are less sensitive to the initial density, suggesting that the Porous–Fisher model provides a better description of the experiments.
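As a rough illustration of the modelling side, the sketch below integrates the Fisher-Kolmogorov equation du/dt = D d²u/dx² + λu(1−u) from an initial "scratched" density profile, using an explicit finite-difference scheme. The parameter values are arbitrary assumptions for demonstration, not the calibrated estimates of D and λ from the paper.

```python
def fisher_kolmogorov(D=0.5, lam=1.0, L=20.0, nx=201, dt=0.001, t_end=2.0,
                      half_width=2.5):
    """Explicit finite differences for du/dt = D*u_xx + lam*u*(1-u) on [0, L],
    starting from confluent density u = 1 with a central cell-free scratch."""
    dx = L / (nx - 1)
    # Initial condition: density 1 everywhere except the scratched region.
    u = [0.0 if abs(i * dx - L / 2) < half_width else 1.0 for i in range(nx)]
    for _ in range(int(t_end / dt)):
        new = u[:]
        for i in range(1, nx - 1):
            lap = (u[i - 1] - 2 * u[i] + u[i + 1]) / dx ** 2
            new[i] = u[i] + dt * (D * lap + lam * u[i] * (1 - u[i]))
        new[0], new[-1] = new[1], new[-2]   # zero-flux boundary conditions
        u = new
    return u
```

Calibration would then amount to adjusting D and λ (or fitting the Porous-Fisher variant, in which D multiplies a density-dependent flux) until the simulated profiles match the measured cell density profiles.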

Relevance:

20.00%

Publisher:

Abstract:

While the two decades since the study by Kavanagh et al. (1993) have given additional insights into effective dissemination of family interventions, the accompanying papers show that progress remains limited. The effectiveness trial that triggered this series of papers offers a cautionary tale. Despite management support, 30–35 hours of workshop training, and training of local supervisors who could act as champions, use of the full intervention was limited. In part this seemed due to the demanding nature of the intervention and its incompatibility with practitioners' roles, and in part to limitations in the training, among other factors. While the accompanying papers note these and other barriers to dissemination, they miss a more disturbing finding in the original paper: practitioners said they were using several aspects of the intervention in routine care, despite being unable to describe accurately what those aspects were. This finding highlights the risks of taking practitioners' reports of their practice in files or supervision sessions at face value, and potentially has implications for reports of other clinical work. The fidelity of disseminated treatments can only be assured by audits of practice, accompanied by affirming but also corrective feedback.

Relevance:

20.00%

Publisher:

Abstract:

Generating discriminative input features is a key requirement for achieving highly accurate classifiers. The process of generating features from raw data is known as feature engineering, and it can take significant manual effort. In this paper we propose automated feature engineering to derive a suite of additional features from a given set of basic features, with the aim both of improving classifier accuracy through discriminative features and of assisting data scientists through automation. Our implementation is specific to HTTP computer network traffic. To measure the effectiveness of our proposal, we compare the performance of a supervised machine learning classifier built with automated feature engineering against one using human-guided features. The classifier addresses a problem in computer network security, namely the detection of HTTP tunnels. We use Bro to process network traffic into base features and then apply automated feature engineering to calculate a larger set of derived features. The derived features are calculated without favour to any base feature and include entropy, length, and N-grams for all string features, and counts and averages over time for all numeric features. Feature selection is then used to find the most relevant subset of these features. Testing showed that both classifiers achieved a detection rate above 99.93% at a false positive rate below 0.01%. For our datasets, we conclude that automated feature engineering can increase classifier development speed and reduce development difficulty by removing manual feature engineering, while maintaining classification accuracy.
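A minimal sketch of the kind of derived string features the abstract describes (entropy, length, n-grams). The function name and output keys are my own; this is not the paper's implementation, nor its Bro processing or feature-selection pipeline.

```python
import math
from collections import Counter

def derived_features(value):
    """Derive entropy, length, and bigram features from one string field."""
    n = len(value)
    counts = Counter(value)
    # Shannon entropy in bits over the character distribution of the string.
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values()) if n else 0.0
    bigrams = Counter(value[i:i + 2] for i in range(n - 1))
    return {
        "length": n,
        "entropy": entropy,
        "top_bigram": bigrams.most_common(1)[0][0] if bigrams else "",
    }
```

Applied to every string-valued base feature of an HTTP flow (URI, user agent, header values), this mechanically yields a much larger derived feature set for the selection stage to prune.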

Relevance:

20.00%

Publisher:

Abstract:

Description of the work

Shrinking Violets comprises two half-scale garments in laser-cut silk organza, developed with a knotting device to allow for disassembly and reassembly. The first is a jacket in layered red organza with black storm-flap details. The second is a vest in jade organza with circles of pink organza attached through a pattern of knots.

Research Background

This practice-led fashion design research sits within the field of Design for Sustainability (DfS) in fashion, which seeks to mitigate the environmental and ethical impacts of fashion consumption and production. The research explores new systems of garment construction for DfS, and examines how these systems may involve 'designing' new user interactions with the garments. The garments' construction system allows them to be disassembled and recycled, or reassembled by users to form a new garment. Conventional garment design follows a set process of cutting and construction, with pattern pieces permanently machine-stitched together. Garments typically contain multiple fibre types; for example, a jacket may be constructed from a shell of wool/polyester, an acetate lining, fusible interlinings, and plastic buttons. These complex inputs mean that textile recycling is highly labour-intensive: first to separate the garment pieces, and second to sort the multiple fibre types. This difficulty results in poor-quality 'shoddy' comprised of many fibre types and unsuitable for new apparel, or in large quantities of recyclable textile waste sent to landfill (Hawley 2011). Design-led approaches that consider the garment's end of life in the design process are a way of addressing this problem. In Gulich's (2006) analysis, use of single materials is the most effective way to ensure ease of recycling, with multiple materials that can be detached next in effectiveness. Given the low rate of technological innovation in most apparel manufacturing (Ruiz 2014), a challenge for effective recycling is how to develop new manufacturing methods that allow garments to be more easily disassembled at end of life.

Research Contribution

This project addresses the research question: how can design for disassembly be considered within the fashion design process? I have employed a practice-led methodology in which my design process leads the research, making use of methods of fashion design practice including garment and construction research, fabric and colour research, textile experimentation, drape, patternmaking, and illustration, as well as more recent methods such as laser cutting. Interrogating traditional approaches to garment construction is necessarily a technical process; however, fashion design is as much about the aesthetic and desirability of a garment as it is about the garment's pragmatics or utility. This requires balancing the technical demands of designing for disassembly with the aesthetic demands of fashion, which led to the selection of luxurious, semi-transparent fabrics in bold floral colours that could be layered to create multiple visual effects, as well as experimentation with laser cutting for new forms of finishing and fastening the fabrics together. Shrinking Violets makes two contributions to new knowledge in the area of design for sustainability within fashion. The first is the technical development of apparel modularity through the system of laser-cut holes and knots, which also becomes a patterning device. The second lies in the design of a system for users to engage with the garment through its ability to be easily reconstructed into a new form.

Research Significance

Shrinking Violets was exhibited at the State Library of Queensland's Asia Pacific Design Library, 1-5 November 2015, as part of the International Association of Societies of Design Research's (IASDR) biennial design conference. The work was chosen for display by a panel of experts, based on the criteria of design innovation and contribution to new knowledge in design.

References

Gulich, B. (2006). Designing textile products that are easy to recycle. In Y. Wang (Ed.), Recycling in Textiles (pp. 25-37). London: Woodhead.
Hawley, J. M. (2011). Textile recycling options: exploring what could be. In A. Gwilt & T. Rissanen (Eds.), Shaping Sustainable Fashion: Changing the way we make and use clothes (pp. 143-155). London: Earthscan.
Ruiz, B. (2014). Global Apparel Manufacturing. Retrieved 10 August 2014, from http://clients1.ibisworld.com/reports/gl/industry/default.aspx?entid=470

Relevance:

20.00%

Publisher:

Abstract:

Swarm intelligence techniques such as particle swarm optimization (PSO) are shown to be inadequate for accurate estimation of global solutions in several engineering applications. This problem is more severe in the case of inverse optimization problems, where fitness calculations are computationally expensive. In this work, a novel strategy is introduced to alleviate this problem. The proposed inverse model, based on a modified particle swarm optimization algorithm, is applied to a contaminant transport inverse model. The inverse models based on the standard PSO and the proposed PSO are validated to estimate the accuracy of the models. The proposed model is shown to outperform the standard one in terms of accuracy of parameter estimation. Preliminary results obtained using the proposed model are presented in this work.
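For reference, here is a standard (unmodified) PSO in plain Python minimising a test function. This is the baseline kind of algorithm the paper modifies, not the proposed method, and the hyperparameter values are conventional choices rather than anything taken from the paper.

```python
import random

def pso(f, dim=2, n=30, iters=200, lo=-5.0, hi=5.0,
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Plain 'standard' PSO minimising f over the box [lo, hi]^dim."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                      # personal best positions
    pbest = [f(x) for x in X]                  # personal best values
    g = min(range(n), key=lambda i: pbest[i])
    G, gbest = P[g][:], pbest[g]               # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                # Inertia + cognitive pull + social pull, then clamp to box.
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (G[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            fx = f(X[i])
            if fx < pbest[i]:
                pbest[i], P[i] = fx, X[i][:]
                if fx < gbest:
                    gbest, G = fx, X[i][:]
    return G, gbest
```

In an inverse problem, f would be the (expensive) misfit between simulated and observed contaminant concentrations, which is exactly why reducing the number of fitness evaluations matters.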

Relevance:

20.00%

Publisher:

Abstract:

Security in a mobile communication environment is always a matter of concern, even after deploying many security techniques at the device, network, and application levels. End-to-end security for mobile applications can be made more robust by developing dynamic schemes at the application level that make use of existing security techniques, which vary in their space, time, and attack complexities. In this paper we present a security technique selection scheme for mobile transactions, called the Transactions-Based Security Scheme (TBSS). The TBSS uses intelligence to study and analyse the security implications of transactions under execution, based on criteria such as user behaviour, transaction sensitivity levels, credibility factors computed over the users' previous transactions, network vulnerability, and device characteristics. The TBSS identifies a suitable level of security techniques from a repository, which consists of symmetric and asymmetric security algorithms arranged in three complexity levels, covering various encryption/decryption techniques, digital signature schemes, and hashing techniques. From this identified level, one of the techniques is deployed at random. The results show a considerable reduction in security cost compared to static schemes, which employ pre-fixed security techniques to secure the transaction data.
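A minimal sketch of the selection idea, assuming a three-level repository and a weighted context score. The repository contents, weights, and thresholds are all invented for illustration; the paper does not specify them here.

```python
import random

# Hypothetical repository: three complexity levels of security techniques.
REPOSITORY = {
    1: ["AES-128", "SHA-256"],
    2: ["AES-256", "RSA-2048", "SHA-384"],
    3: ["RSA-4096", "ECDSA-P521", "SHA-512"],
}

def select_technique(sensitivity, credibility, vulnerability, rng=random):
    """Map transaction context (all inputs in [0, 1]) to a complexity level,
    then deploy a technique chosen at random from that level, as in TBSS.
    High sensitivity, high network vulnerability, and low user credibility
    all push the score (and hence the level) upward."""
    score = 0.5 * sensitivity + 0.3 * vulnerability + 0.2 * (1 - credibility)
    level = 1 if score < 0.34 else 2 if score < 0.67 else 3
    return level, rng.choice(REPOSITORY[level])
```

The random draw within a level is what reduces average cost versus a static scheme: routine low-risk transactions never pay for the heaviest algorithms, and the unpredictability itself adds resistance to targeted attacks.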

Relevance:

20.00%

Publisher:

Abstract:

Experimental characterization of high-dimensional dynamic systems sometimes uses the proper orthogonal decomposition (POD). If there are many measurement locations and relatively fewer sensors, then steady-state behavior can still be studied by sequentially taking several sets of simultaneous measurements. The number of such measurement sets required can be minimized by solving a combinatorial optimization problem. We aim to bring this problem to the attention of engineering audiences, summarize some known mathematical results about it, and present a heuristic (suboptimal) calculation that gives reasonable, if not stellar, results.
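One way to see the combinatorial structure: every pair of locations must appear together in some simultaneous-measurement set so that the corresponding entry of the spatial correlation matrix (needed for POD) can be estimated. The greedy sketch below is an illustrative heuristic in that spirit, assuming this pairwise-coverage formulation; it is not the calculation from the paper.

```python
from itertools import combinations

def greedy_measurement_sets(n_locations, n_sensors):
    """Greedy heuristic: build groups of n_sensors simultaneously measured
    locations until every pair of locations has shared a group."""
    uncovered = set(combinations(range(n_locations), 2))
    groups = []
    while uncovered:
        a, b = min(uncovered)          # seed each group with an uncovered pair
        group = [a, b]
        while len(group) < n_sensors:
            best, gain = None, -1
            for loc in range(n_locations):
                if loc in group:
                    continue
                # Number of not-yet-covered pairs this location would add.
                g = sum(1 for m in group if tuple(sorted((loc, m))) in uncovered)
                if g > gain:
                    best, gain = loc, g
            group.append(best)
        groups.append(sorted(group))
        uncovered -= {tuple(sorted(p)) for p in combinations(group, 2)}
    return groups
```

Greedy set-construction like this is suboptimal in general, which matches the paper's framing of its heuristic as "reasonable, if not stellar".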

Relevance:

20.00%

Publisher:

Abstract:

Evidence-based policy is a means of ensuring that policy is informed by more than ideology or expedience. However, what constitutes robust evidence is highly contested. In this paper, we argue policy must draw on quantitative and qualitative data. We do this in relation to a long entrenched problem in Australian early childhood education and care (ECEC) workforce policy. A critical shortage of qualified staff threatens the attainment of broader child and family policy objectives linked to the provision of ECEC and has not been successfully addressed by initiatives to date. We establish some of the limitations of existing quantitative data sets and consider the potential of qualitative studies to inform ECEC workforce policy. The adoption of both quantitative and qualitative methods is needed to illuminate the complex nature of the work undertaken by early childhood educators, as well as the environmental factors that sustain job satisfaction in a demanding and poorly understood working environment.

Relevance:

20.00%

Publisher:

Abstract:

We discuss a technique for solving the Landau-Zener (LZ) problem of finding the probability of excitation in a two-level system. The idea of time reversal for the Schrödinger equation is employed to obtain the state reached at the final time, and hence the excitation probability. Using this method, which reproduces the well-known expression for the LZ transition probability, we solve a variant of the LZ problem which involves waiting at the minimum gap for a time t_w; we find an exact expression for the excitation probability as a function of t_w. We provide numerical results to support our analytical expressions. We then discuss the problem of waiting at the quantum critical point of a many-body system and calculate the residual energy generated by the time-dependent Hamiltonian. Finally, we discuss possible experimental realizations of this work.
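The closed-form LZ result can be checked by direct numerical integration of the Schrödinger equation. The sketch below uses a fixed-step RK4 integrator and my own parameter choices; it is a generic check of Zener's formula, not the time-reversal technique of the paper.

```python
import math

def lz_excitation(delta=0.25, alpha=1.0, T=60.0, dt=0.002):
    """RK4 integration of i dc/dt = H(t) c for the two-level LZ Hamiltonian
    H(t) = [[alpha*t/2, delta], [delta, -alpha*t/2]] (hbar = 1), swept from
    t = -T to t = T. Starting in the first diabatic state (the ground state
    at large negative t), the population left in it at the end is the
    excitation probability."""
    def deriv(t, c1, c2):
        return (-1j * (alpha * t / 2 * c1 + delta * c2),
                -1j * (delta * c1 - alpha * t / 2 * c2))
    c1, c2 = 1.0 + 0j, 0.0 + 0j
    t = -T
    for _ in range(int(round(2 * T / dt))):
        k1 = deriv(t, c1, c2)
        k2 = deriv(t + dt / 2, c1 + dt / 2 * k1[0], c2 + dt / 2 * k1[1])
        k3 = deriv(t + dt / 2, c1 + dt / 2 * k2[0], c2 + dt / 2 * k2[1])
        k4 = deriv(t + dt, c1 + dt * k3[0], c2 + dt * k3[1])
        c1 += dt / 6 * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0])
        c2 += dt / 6 * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1])
        t += dt
    return abs(c1) ** 2

# Zener's closed form for comparison (hbar = 1): P = exp(-2*pi*delta**2/alpha)
```

The waiting-time variant studied in the paper would amount to holding the Hamiltonian fixed at t = 0 (the minimum gap) for a time t_w in the middle of this sweep.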

Relevance:

20.00%

Publisher:

Abstract:

Objective: To determine the extent to which different strength training exercises selectively activate the commonly injured biceps femoris long head (BFLH) muscle. Methods: This two-part observational study recruited 24 recreationally active males. Part 1 explored the amplitudes and the ratios of lateral to medial hamstring (BF/MH) normalised electromyography (nEMG) during the concentric and eccentric phases of 10 common strength training exercises. Part 2 used functional magnetic resonance imaging (fMRI) to determine the spatial patterns of hamstring activation during the two exercises which (i) most selectively and (ii) least selectively activated the BF in Part 1. Results: Eccentrically, the largest BF/MH nEMG ratio was observed in the 45° hip extension exercise and the lowest in the Nordic hamstring exercise (NHE) and bent-knee bridge. Concentrically, the highest BF/MH nEMG ratio was observed during the lunge and 45° hip extension, and the lowest during the leg curl and bent-knee bridge. fMRI revealed a greater BFLH to semitendinosus activation ratio in the 45° hip extension than in the NHE (p<0.001). The T2 increases after hip extension for the BFLH, semitendinosus, and semimembranosus muscles were greater than that for the biceps femoris short head (BFSH) (p<0.001). During the NHE, the T2 increase was greater for the semitendinosus than for the other hamstrings (p≤0.002). Conclusion: This investigation highlights the non-uniformity of hamstring activation patterns across tasks and suggests that the hip extension exercise more selectively activates the BFLH, while the NHE preferentially recruits the semitendinosus. These findings have implications for strength training interventions aimed at preventing hamstring injury.

Relevance:

20.00%

Publisher:

Abstract:

Prior to embarking on further study into the subject of relevance it is essential to consider why the concept of relevance has remained inconclusive, despite extensive research and its centrality to the discipline of information science. The approach taken in this paper is to reconstruct the science of information retrieval from first principles including the problem statement, role, scope and objective. This framework for document selection is put forward as a straw man for comparison with the historical relevance models. The paper examines five influential relevance models over the past 50 years. Each is examined with respect to its treatment of relevance and compared with the first principles model to identify contributions and deficiencies. The major conclusion drawn is that relevance is a significantly overloaded concept which is both confusing and detrimental to the science.

Relevance:

20.00%

Publisher:

Abstract:

Let G = (V, E) be a simple, finite, undirected graph. For S ⊆ V, let δ(S, G) = { (u, v) ∈ E : u ∈ S and v ∈ V − S } and φ(S, G) = { v ∈ V − S : there exists u ∈ S such that (u, v) ∈ E } be the edge and vertex boundary of S, respectively. Given an integer i, 1 ≤ i ≤ |V|, the edge and vertex isoperimetric values at i are defined as b_e(i, G) = min over S ⊆ V with |S| = i of |δ(S, G)|, and b_v(i, G) = min over S ⊆ V with |S| = i of |φ(S, G)|, respectively. The edge (vertex) isoperimetric problem is to determine b_e(i, G) (respectively b_v(i, G)) for each i, 1 ≤ i ≤ |V|. If we add the further restriction that S must induce a connected subgraph of G, the corresponding variation is known as the connected isoperimetric problem, and the connected edge (vertex) isoperimetric values are defined analogously. It turns out that the connected edge and connected vertex isoperimetric values are equal at each i, 1 ≤ i ≤ |V|, if G is a tree; we therefore use b_c(i, T) to denote the connected edge (vertex) isoperimetric value of a tree T at i. Hofstadter introduced the interesting concept of meta-Fibonacci sequences in his famous book "Gödel, Escher, Bach: An Eternal Golden Braid". The sequence he introduced is known as the Hofstadter sequence, and most of the problems he raised about it are still open. Since then, mathematicians have studied many other closely related meta-Fibonacci sequences, such as the Tanny, Conway, and Conolly sequences. Let T_2 be the infinite complete binary tree. In this paper we relate the connected isoperimetric problem on T_2 to the Tanny sequence, which is defined by the recurrence relation a(i) = a(i − 1 − a(i − 1)) + a(i − 2 − a(i − 2)), with a(0) = a(1) = a(2) = 1. In particular, we show that b_c(i, T_2) = i + 2 − 2a(i) for each i ≥ 1.
We also propose efficient polynomial-time algorithms to find the vertex isoperimetric values at each i for bounded-pathwidth and bounded-treewidth graphs.
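The Tanny recurrence and the identity b_c(i, T_2) = i + 2 − 2a(i) stated in the abstract are easy to evaluate computationally; a short memoised sketch:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def tanny(i):
    """Tanny sequence: a(0) = a(1) = a(2) = 1 and, for i > 2,
    a(i) = a(i - 1 - a(i - 1)) + a(i - 2 - a(i - 2))."""
    if i <= 2:
        return 1
    return tanny(i - 1 - tanny(i - 1)) + tanny(i - 2 - tanny(i - 2))

def b_c(i):
    """Connected isoperimetric value of the infinite complete binary tree T_2
    at i, via the identity b_c(i, T_2) = i + 2 - 2*a(i) from the abstract."""
    return i + 2 - 2 * tanny(i)
```

The first few Tanny values are 1, 1, 1, 2, 2, 2, 3, 4, ..., so for instance b_c(7) = 7 + 2 − 2·4 = 1: seven vertices forming a complete subtree of T_2 have a single boundary edge.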