979 results for Conveying machinery


Relevance: 10.00%

Abstract:

The Reeb graph of a scalar function represents the evolution of the topology of its level sets. In this video, we describe a near-optimal output-sensitive algorithm for computing the Reeb graph of scalar functions defined over manifolds. Key to the simplicity and efficiency of the algorithm is an alternate definition of the Reeb graph that considers equivalence classes of level sets instead of individual level sets. The algorithm works in two steps. The first step locates all critical points of the function in the domain. Arcs in the Reeb graph are computed in the second step using a simple search procedure that works on a small subset of the domain that corresponds to a pair of critical points. The algorithm is also able to handle non-manifold domains.
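The first step of the algorithm above, locating critical points, has a standard concrete form for piecewise-linear functions on triangulated 2-manifolds: a vertex is classified by the pattern of lower and upper neighbours around its link. The sketch below illustrates that classical test, not this paper's actual implementation; the function name and the cyclically ordered `link_values` input are assumptions, and ties (equal function values) are not handled.

```python
def classify_vertex(f_v, link_values):
    """Classify a vertex of a piecewise-linear scalar function on a
    triangulated 2-manifold by the pattern of lower/upper neighbours
    around its link.

    f_v         -- function value at the vertex
    link_values -- function values at the link vertices, in cyclic order
                   (assumed distinct from f_v; ties are not handled)
    """
    signs = [1 if f > f_v else -1 for f in link_values]  # +1 upper, -1 lower
    if all(s > 0 for s in signs):
        return "minimum"        # every neighbour is higher
    if all(s < 0 for s in signs):
        return "maximum"        # every neighbour is lower
    # Count sign changes around the cyclic link:
    # exactly 2 -> regular vertex, 4 or more -> saddle.
    changes = sum(signs[i] != signs[(i + 1) % len(signs)]
                  for i in range(len(signs)))
    return "regular" if changes == 2 else "saddle"
```

For example, a vertex at height 0 whose link alternates above/below/above/below (`[1.0, -1.0, 1.0, -1.0]`) is reported as a saddle.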

Relevance: 10.00%

Abstract:

We study wireless multihop energy harvesting sensor networks employed for random field estimation. The sensors sense the random field and generate data that is to be sent to a fusion node for estimation. Each sensor has an energy harvesting source and can operate in two modes: Wake and Sleep. We consider the problem of obtaining jointly optimal power control, routing and scheduling policies that ensure a fair utilization of network resources. This problem has a high computational complexity. Therefore, we develop a computationally efficient suboptimal approach to obtain good solutions to this problem. We study the optimal solution and performance of the suboptimal approach through some numerical examples.

Relevance: 10.00%

Abstract:

This study examines the consumption talk of fashion bloggers. Its purpose is to determine what kind of consumerhood is expressed in fashion blogs and how fashion blogs have developed from 2007 up to the time of this study, based on the research material and my own observations. The study was carried out using qualitative research methods. I collected the material from ten fashion blogs written by women during two separate periods in 2009. The study also has features of both ethnographic and netnographic observation. Thematic analysis and typification were used in analysing the material. The study found that seeing fashion bloggers as searching for their identity is linked to the search for a personal style and a desire for distinction. The traditional view of the consumer as a chooser and a passive actor in the market is receding, as the fashion bloggers in this study appear as active agents and producers. Fashion bloggers also constantly seek new consumption experiences and communicate with one another by conveying meanings through their consumption. In this study, fashion bloggers' consumption talk appears rational, in line with the economistic consumption ethos and traditional Finnish consumption talk. Frugality is considered virtuous, and product prices influence purchase decisions. Nevertheless, fashion bloggers know how to enjoy consumption in a controlled way. Their consumption talk also follows the tradition of the ecological-ethical consumption ethos, which shows in the avoidance of fanaticism and in a preference for flea markets, both traditional and on social media. In addition, fashion bloggers' consumption talk reveals a social compulsion: keeping a fashion blog interesting requires a constant craving for new consumer products. Based on the study, fashion blogs emphasise community, and the consumption in them is distinctly feminine.
According to my conclusions from the material and observations, fashion blogs are turning into lifestyle blogs, as they increasingly cover topics other than consumption. At the same time, fashion bloggers are developing from ordinary consumers towards expertise, that is, professionalising. With professionalisation, fashion bloggers act as a new kind of consumer educator, and new words and meanings emerge in fashion blogs. Fashion bloggers have also developed knowledge and skills that can be sold. Indeed, fashion bloggers are becoming fashion professionals who will in the future also be paid for blogging.

Relevance: 10.00%

Abstract:

The Finnish forest industry is in the middle of a radical change. The deepening recession and the falling demand for the woodworking industry's traditional products have also forced the sawmilling industry to look for new, more fruitful solutions to improve its operating conditions. In recent years, the role of bioenergy production has often been highlighted as part of sawmills' business repertoire. Sawmilling naturally produces large volumes of by-products (e.g. bark, sawdust, chips) which could be exploited more effectively in energy production; this would bring additional income or perhaps even create new business opportunities for sawmills. Bioenergy production is also supported by government climate and energy policies favouring renewable energy sources, by public financial subsidies, and by the soaring prices of fossil fuels. The decreasing production of the domestic pulp and paper industry also releases a fair amount of sawmill by-products for other uses. However, bioenergy production as part of sawmills' by-product utilization has so far been studied very little from a managerial point of view. The purpose of this study was to explore the relative significance of the main bioenergy-related processes, resources and factors at Finnish independent industrial sawmills, including partnerships, cooperation, customer relationships and investments, as well as the future prospects of bioenergy business at these sawmills, with the help of two resource-based approaches (the resource-based view and the natural-resource-based view). The data comprised secondary data (e.g. literature) and primary data drawn from interviews with sawmill managers (or the persons in charge of decisions regarding bioenergy production at each sawmill). A literature review and the Delphi method, with two questionnaires, were used as the methods of the study.
According to the results, the most significant processes in the value chain of the bioenergy business are connected to raw material availability and procurement, and to customer relationship management. In addition to raw material and services, the most significant resources included the factory and machinery, personnel, collaboration, and geographic location. Long-term cooperation deals were clearly valued as the most significant form of collaboration, especially in processes connected to raw material procurement. The results also revealed that factors related to demand, subsidies and prices were of the highest importance for sawmills' future bioenergy business. However, a majority of the respondents required that certain preconditions connected to these factors be fulfilled before they would continue their bioenergy-related investments. In general, the answers showed a wide divergence of opinion among the respondents, which may reflect sawmills' differing emphases and expectations concerning bioenergy. In other words, bioenergy is still perceived as a rather novel and risky area of business at Finnish independent industrial sawmills. These results indicate that a massive expansion of the bioenergy business at private sawmills in Finland is not a self-evident truth. The blocking barriers seem to be connected mainly to bioenergy demand and to money. The respondents' answers revealed a growing dissatisfaction with the policies of the authorities, which do not treat sawmill-based bioenergy on an equal footing with other forms of bioenergy. This sentiment was boiled down in one sawmill manager's comment: "There is a lot of bioenergy available, if they just want to make use of it." It seems that the positive effects of government policies favouring renewables are not taking effect at private sawmills.
Nevertheless, as there does appear to be considerable potential in the emerging bioenergy business at Finnish independent industrial sawmills, there is a clear need for more profound future studies on this topic.

Relevance: 10.00%

Abstract:

The coherence of the Soviet bloc was seriously tested at the turn of the 1970s, as the Soviet Union and its allies engaged in intensive negotiations over their relations with the European Communities (EC). In an effort to secure their own national economic interests, many East European countries began independent manoeuvres against the wishes of their bloc leader. However, much of the intra-bloc controversy was kept out of the public eye, as the battle largely took place behind the scenes, within the organisation for economic cooperation, the Council for Mutual Economic Assistance (CMEA). The CMEA policy-making process vis-à-vis the EC is described in this study with reference to primary archival materials. This study investigates the negotiating positions and powers of the CMEA member states in their efforts to deal with the economic challenge created by the progress of the EC as it advanced towards the customs union. This entails an analysis of the functioning principles and performance of the CMEA machinery. The study traces the CMEA negotiations that began in 1970 over its policy toward the EC. The policy was finally adopted in 1974, and was followed by the first official meeting between the two organisations in early 1975. The story ends in 1976, when the CMEA's efforts to enter into working relations with the EC were seemingly frustrated by the latter. The first major finding of the study is that, contrary to much of the prior research, the Soviet Union was not in a hegemonic position vis-à-vis its allies. It had to use a great deal of its resources to tame the independent manoeuvring of its smaller allies. Thus, the USSR was not the kind of bloc leader that the totalitarian literature has described. Because the Soviet Union had to devote so much attention to its own bloc politics, it was not able to concentrate on formulating a policy vis-à-vis the EC.
Thus, the Soviet leadership was dependent on its allies in those instances when the socialist countries needed to act as a bloc. This consequently opened up room for the USSR's allies to manoeuvre. This study also argues that when the CMEA did manage to find a united position, it was a force that the EC had to reckon with in its policy-making. This was particularly the case in the implementation of the EC Common Commercial Policy. The other main finding of the study is that, although it has been largely neglected in the previous literature on the history of West European integration, the CMEA did in fact have an effect on EC decision-making. This study shows how, for political and ideological reasons, the CMEA members did not acknowledge the EC's supranational authority. Therefore the EC had no choice but to refrain from implementing its Common Commercial Policy in full.

Relevance: 10.00%

Abstract:

Software transactional memory (STM) is a promising programming paradigm for shared-memory multithreaded programs, offering an alternative to traditional lock-based synchronization. However, adoption of STM in mainstream software has been quite low due to its considerable overheads and poor cache/memory performance. In this paper, we perform a detailed study of the cache behavior of STM applications and quantify the impact of different STM factors on the cache misses experienced by the applications. Based on our analysis, we propose compiler-driven Lock-Data Colocation (LDC), targeted at reducing the cache overheads of STM. We show that LDC is effective in improving the cache behavior of STM applications by reducing the dcache miss latency and improving execution-time performance.

Relevance: 10.00%

Abstract:

In this work, we propose a new organization for the last-level shared cache of a multicore system. Our design is based on the observation that the Next-Use distance, measured in terms of intervening misses between the eviction of a line and its next use, for lines brought in by a given delinquent PC falls within a predictable range of values. We exploit this correlation to improve the performance of shared caches in multicore architectures by proposing the NUcache organization.
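To make the Next-Use metric concrete, the sketch below measures it on an access trace with a toy fully associative LRU cache: each evicted line remembers the PC that allocated it and the miss count at eviction, and when the line next returns, the number of intervening misses is credited to that PC. This is an illustrative assumption about how such a profile could be gathered offline, not the NUcache hardware mechanism; `next_use_distances` and the `(pc, line)` trace format are hypothetical.

```python
from collections import OrderedDict, defaultdict

def next_use_distances(trace, capacity):
    """Profile Next-Use distances on a (pc, line) access trace using a
    toy fully associative LRU cache: for each line, count the misses
    intervening between its eviction and its next access, and credit
    that distance to the PC that originally brought the line in."""
    cache = OrderedDict()       # line -> allocating pc, in LRU order
    evicted = {}                # line -> (allocating pc, miss count at eviction)
    misses = 0
    dists = defaultdict(list)
    for pc, line in trace:
        if line in cache:
            cache.move_to_end(line)            # hit: refresh LRU position
            continue
        misses += 1                            # miss
        if line in evicted:                    # line returns after eviction
            alloc_pc, at_eviction = evicted.pop(line)
            dists[alloc_pc].append(misses - at_eviction - 1)
        if len(cache) >= capacity:             # make room: evict LRU line
            victim, victim_pc = cache.popitem(last=False)
            evicted[victim] = (victim_pc, misses)
        cache[line] = pc                       # allocate; remember delinquent PC
    return dict(dists)
```

On the trace A, B, C, A with a one-line cache, A is evicted by B and returns after one intervening miss (C), so the PC that allocated A is credited a Next-Use distance of 1.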

Relevance: 10.00%

Abstract:

Heat shock protein 90 participates in diverse biological processes ranging from protein folding, the cell cycle, signal transduction and development to evolution in all eukaryotes. It is also critically involved in regulating the growth of protozoa such as Dictyostelium discoideum, Leishmania donovani, Plasmodium falciparum, Trypanosoma cruzi, and Trypanosoma evansi. Selective inhibition of Hsp90 has also been explored as an intervention strategy against important human diseases such as cancer, malaria, and trypanosomiasis. Giardia lamblia, a simple protozoan parasite of humans and animals, is an important cause of diarrheal disease with significant morbidity and some mortality in tropical countries. Here we show that the G. lamblia cytosolic hsp90 (glhsp90) is split into two similar-sized fragments located 777 kb apart on the same scaffold. Intrigued by this unique arrangement, which appears to be specific to the Giardiinae, we investigated the biosynthesis of GlHsp90. We used genome sequencing to confirm the split nature of the giardial hsp90. However, a specific antibody raised against the peptide detected a product with a mass of about 80 kDa, suggesting a post-transcriptional rescue of the genomic defect. We show evidence for the joining of the two independent Hsp90 transcripts in trans into one long mature mRNA, presumably by RNA splicing. The splicing junction carries hallmarks of classical cis-spliced introns, suggesting that the regular cis-splicing machinery may be sufficient for repair of the open reading frame. A complementary 26-nt sequence in the "intron" regions adjacent to the splice sites may assist in positioning the two pre-mRNAs for processing. This is the first example of post-transcriptional rescue of a split gene by trans-splicing.

Relevance: 10.00%

Abstract:

Design, fabrication and preliminary testing of a flat pump with millimetre thickness are described in this paper. The pump is made entirely of polymer materials, barring the magnet and copper coils used for electromagnetic actuation. The fabrication is carried out using widely available microelectronic packaging machinery and techniques; it is therefore straightforward and inexpensive. Two types of prototypes were designed and built. One consists of copper coils etched on an epoxy plate; the other uses wound insulated wire of 90 µm diameter as a coil. The overall size of the first pump is 25 mm × 25 mm × 3.6 mm, including the 3.1 mm-thick NdFeB magnet of 12 mm diameter. It consists of a pump chamber of 20 mm × 20 mm × 0.8 mm with copper coils etched from a copper-clad epoxy plate using dry-film lithography and milled using a CNC milling machine, two passive valves, and a pump diaphragm made of Kapton film of 0.089 mm thickness. The second pump has an overall size of 35 mm × 35 mm × 4.4 mm including the magnet and the windings. A breadboard circuit and DC power supply are used to test the pump by applying an alternating square-wave voltage pulse. A water slug in a tube attached to the inlet is used to observe and measure the air flow induced by the pump against atmospheric pressure. The maximum flow rate was found to be 15 ml/min at a voltage of 2.5 V and a current of 19 mA at 68 Hz.

Relevance: 10.00%

Abstract:

We develop four algorithms for simulation-based optimization under multiple inequality constraints. Both the cost and the constraint functions are considered to be long-run averages of certain state-dependent single-stage functions. We pose the problem in the simulation optimization framework using the Lagrange multiplier method. Two of our algorithms estimate only the gradient of the Lagrangian, while the other two estimate both its gradient and its Hessian. In the process, we also develop several new estimators for the gradient and Hessian. All our algorithms use two simulations each. Two of the algorithms are based on the smoothed functional (SF) technique, while the other two are based on the simultaneous perturbation stochastic approximation (SPSA) method. We prove the convergence of our algorithms and present numerical experiments on a setting involving an open Jackson network. The Newton-based SF algorithm shows the best overall performance.
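The two-simulation idea behind SPSA can be sketched independently of the constrained algorithms in the paper: a single gradient estimate perturbs every coordinate at once with a random ±1 vector and needs only two function evaluations, regardless of dimension. The quadratic objective, gains, and iteration count below are illustrative assumptions for a deterministic-seed demo, not the paper's setting (which involves long-run averages and Lagrange multipliers).

```python
import numpy as np

def spsa_gradient(f, theta, c, rng):
    """One SPSA estimate of grad f(theta): two evaluations of f along a
    random simultaneous +/-1 (Rademacher) perturbation of all coordinates."""
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    f_plus = f(theta + c * delta)
    f_minus = f(theta - c * delta)
    return (f_plus - f_minus) / (2.0 * c * delta)  # elementwise division

# Demo: plain stochastic gradient descent on f(x) = ||x||^2.
f = lambda x: float(np.sum(x ** 2))
theta = np.array([2.0, -3.0])
rng = np.random.default_rng(42)
for _ in range(200):
    theta -= 0.1 * spsa_gradient(f, theta, c=0.1, rng=rng)
```

For a quadratic, the two-sided difference is exact (no finite-difference bias), so the iterate contracts towards the minimiser at the origin.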

Relevance: 10.00%

Abstract:

An (alpha, beta)-spanner of an unweighted graph G is a subgraph H that distorts distances in G by at most a multiplicative factor of alpha and an additive term beta. It is well known that any graph contains a (multiplicative) (2k-1, 0)-spanner of size O(n^(1+1/k)) and an (additive) (1, 2)-spanner of size O(n^(3/2)). However, no other additive spanners are known to exist. In this article we develop a couple of new techniques for constructing (alpha, beta)-spanners. Our first result is an additive (1, 6)-spanner of size O(n^(4/3)). The construction algorithm can be understood as an economical agent that assigns costs and values to paths in the graph, purchasing affordable paths and ignoring expensive ones, which are intuitively well approximated by paths already purchased. We show that this path-buying algorithm can be parameterized in different ways to yield other sparseness-distortion tradeoffs. Our second result addresses the question of which (alpha, beta)-spanners can be computed efficiently, ideally in linear time. We show that, for any k, a (k, k-1)-spanner of size O(k·n^(1+1/k)) can be found in linear time and, further, that in a distributed network the algorithm terminates in a constant number of rounds. Previous spanner constructions with similar performance had roughly twice the multiplicative distortion.
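The well-known multiplicative guarantee cited above has a very short constructive proof, the greedy spanner of Althöfer et al.: scan the edges and keep one only when the spanner built so far does not already connect its endpoints within distance 2k-1. The sketch below shows that classical construction, not this article's path-buying or linear-time algorithms; the edge-list input format is an assumption.

```python
from collections import deque

def greedy_spanner(n, edges, k):
    """Greedy (2k-1)-multiplicative spanner of an unweighted graph on
    vertices 0..n-1. Keep edge (u, v) only if u and v are more than
    2k-1 apart in the spanner built so far."""
    adj = [[] for _ in range(n)]
    spanner = []

    def within(u, v, limit):
        # Depth-limited BFS from u in the current spanner.
        dist = {u: 0}
        q = deque([u])
        while q:
            x = q.popleft()
            if x == v:
                return True
            if dist[x] == limit:
                continue
            for y in adj[x]:
                if y not in dist:
                    dist[y] = dist[x] + 1
                    q.append(y)
        return False

    for u, v in edges:
        if not within(u, v, 2 * k - 1):
            spanner.append((u, v))
            adj[u].append(v)
            adj[v].append(u)
    return spanner
```

On the complete graph K5 with k = 2 (stretch 3), scanning edges in sorted order leaves just the star centred at vertex 0: every other pair is already within distance 2 through the centre.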

Relevance: 10.00%

Abstract:

We review the current status of various aspects of biopolymer translocation through nanopores and the challenges and opportunities it offers. Much of the interest generated by nanopores arises from their potential application to third-generation cheap and fast genome sequencing. Although the ultimate goal of single-nucleotide identification has not yet been reached, great advances have been made both from a fundamental and an applied point of view, particularly in controlling the translocation time, fabricating various kinds of synthetic pores or genetically engineering protein nanopores with tailored properties, and in devising methods (used separately or in combination) aimed at discriminating nucleotides based either on ionic or transverse electron currents, optical readout signatures, or on the capabilities of the cellular machinery. Recently, exciting new applications have emerged, for the detection of specific proteins and toxins (stochastic biosensors), and for the study of protein folding pathways and binding constants of protein-protein and protein-DNA complexes. The combined use of nanopores and advanced micromanipulation techniques involving optical/magnetic tweezers with high spatial resolution offers unique opportunities for improving the basic understanding of the physical behavior of biomolecules in confined geometries, with implications for the control of crucial biological processes such as protein import and protein denaturation. We highlight the key works in these areas along with future prospects. Finally, we review theoretical and simulation studies aimed at improving fundamental understanding of the complex microscopic mechanisms involved in the translocation process. Such understanding is a pre-requisite to fruitful application of nanopore technology in high-throughput devices for molecular biomedical diagnostics.

Relevance: 10.00%

Abstract:

SecB is a homotetrameric cytosolic chaperone that forms part of the protein translocation machinery in E. coli. Due to SecB, nascent polypeptides are maintained in an unfolded translocation-competent state devoid of tertiary structure and thus are guided to the translocon. In vitro, SecB rapidly binds to a variety of ligands in a non-native state. We have previously investigated the bound-state conformation of the model substrate bovine pancreatic trypsin inhibitor (BPTI) as well as the conformation of SecB itself by using proximity relationships based on site-directed spin labeling and pyrene fluorescence methods. It was shown that SecB undergoes a conformational change during the process of substrate binding. Here, we generated SecB mutants containing but a single cysteine per subunit or an exposed, highly reactive new cysteine after removal of the nearby intrinsic cysteines. Quantitative spin labeling was achieved with the methanethiosulfonate spin label (MTS) at positions C97 or E90C, respectively. High-field (W-band) electron paramagnetic resonance (EPR) measurements revealed that with BPTI present the spin labels are exposed to a more polar/hydrophilic environment. Nanoscale distance measurements with double electron-electron resonance (DEER) were in excellent agreement with distances obtained by molecular modeling. Binding of BPTI also led to a slight change in distances between labels at C97 but not at E90C. While the shorter distance in the tetramer increased, the larger diagonal distance decreased. These findings can be explained by a widening of the tetrameric structure upon substrate binding, much like the opening of two pairs of scissors.

Relevance: 10.00%

Abstract:

MATLAB is an array language, initially popular for rapid prototyping, but now increasingly used to develop production code for numerical and scientific applications. Typical MATLAB programs have abundant data parallelism. These programs also have control-flow-dominated scalar regions that have an impact on the program's execution time. Today's computer systems have tremendous computing power in the form of traditional CPU cores and throughput-oriented accelerators such as graphics processing units (GPUs). Thus, an approach that maps the control-flow-dominated regions to the CPU and the data-parallel regions to the GPU can significantly improve program performance. In this paper, we present the design and implementation of MEGHA, a compiler that automatically compiles MATLAB programs to enable synergistic execution on heterogeneous processors. Our solution is fully automated and does not require programmer input for identifying data-parallel regions. We propose a set of compiler optimizations tailored for MATLAB. Our compiler identifies data-parallel regions of the program and composes them into kernels. The problem of combining statements into kernels is formulated as a constrained graph clustering problem. Heuristics are presented to map identified kernels to either the CPU or the GPU so that kernel execution on the CPU and the GPU happens synergistically and the amount of data transfer needed is minimized. In order to ensure the required data movement for dependencies across basic blocks, we propose a data flow analysis and edge splitting strategy. Thus our compiler automatically handles composition of kernels, mapping of kernels to the CPU and GPU, scheduling, and insertion of required data transfers. The proposed compiler was implemented, and experimental evaluation using a set of MATLAB benchmarks shows that our approach achieves a geometric mean speedup of 19.8X for data-parallel benchmarks over native execution of MATLAB.