980 results for Open Journal Systems


Relevance:

90.00%

Publisher:

Abstract:

Anyone who looks at the title of this special issue will agree that the intent behind the preparation of this volume was ambitious: to predict and discuss "The Future of Manufacturing". Will manufacturing be important in the future? Even though some sceptics might say not, and put some old familiar arguments on the table, we would strongly disagree. To provide substance for the argument we issued the call for papers for this special issue of the Journal of Manufacturing Technology Management, fully aware of the size of the challenge in our hands, but strongly believing that the enterprise would be worthwhile. The point of departure is the ongoing debate concerning the meaning and content of manufacturing. The easily visualised internal activity of using tangible resources to make physical products in factories is no longer a viable way to characterise manufacturing. It is now a more loosely defined concept concerning the organisation and management of open, interdependent systems for delivering goods and services, tangible and intangible, to diverse types of markets. Interestingly, Wickham Skinner is the most cited author in this special issue of JMTM. He provides the departure point of several articles because his vision and insights have guided and inspired researchers in production and operations management from the late 1960s until today. However, the picture that we draw after looking at the contributions in this special issue is intrinsically distinct, much more dynamic, and complex.

Seven articles address the following research themes: (1) new patterns of organisation, where the boundaries of firms become blurred and the role of the firm in the production system, as well as that of manufacturing within the firm, become contingent; (2) new approaches to strategic decision-making in markets characterised by turbulence and weak signals at the customer interface; (3) new challenges in strategic and operational decisions due to changes in the profile of the workforce; (4) new global players, especially China, modifying the manufacturing landscape; and (5) new techniques, methods and tools that are being made feasible through progress in new technological domains. Of course, many other important dimensions could be studied, but these themes are representative of current changes and future challenges.

Three articles look at the first theme: the organisational evolution of production and operations in firms and networks. Karlsson and Skold's article represents one further step in their efforts to characterise "the extraprise". They advance the construction of a new framework, based on "the network perspective", by defining the formal elements which compose it and exploring the meaning of different types of relationships. The way in which "actors, resources and activities" are conceptualised extends the existing boundaries of analytical thinking in operations management and opens new avenues for research, teaching and practice. The higher level of abstraction, an intrinsic feature of the framework, is associated with the increasing degree of complexity that characterises decisions related to strategy and implementation in the manufacturing and operations area, a feature that is expected to become more and more pervasive as time proceeds. Riis, Johansen, Englyst and Sorensen have also based their article on their previous work, in this case on "the interactive firm". They advance new propositions on the strategic roles of manufacturing and discuss why the configuration of strategic manufacturing roles, at the level of the network, will become a key issue and how the indirect strategic roles of manufacturing will become increasingly important. Additionally, by considering that value chains will become value webs, they predict that shifts in strategic manufacturing roles will look like a sequence of moves similar to a game of chess. Lastly under the first theme, Fleury and Fleury develop a conceptual framework for the study of production systems in general, derived from field research in the telecommunications industry, here considered a prototype of the coming information society and knowledge economy. They propose a new typology of firms which, on certain dimensions, complements the propositions found in the other two articles. Their telecoms-based framework (TbF) comprises six types of companies characterised by distinct profiles of organisational competences, which interact according to specific patterns of relationships, thus creating distinct configurations of production networks.

The second theme is addressed by Kyläheiko and Sandström in their article "Strategic options based framework for management of dynamic capabilities in manufacturing firms". They propose a new approach to strategic decision-making in markets characterised by turbulence and weak signals at the customer interface. Their framework for a manufacturing firm in the digital age leads to active asset selection (strategic investments in both tangible and intangible assets) and efficient orchestration of the global value net in "thin" intangible asset markets. The framework consists of five steps based on Porter's five-forces model and the resource-based view, complemented by the concepts of strategic options and related flexibility issues.

Thun, Grössler and Miczka's contribution to the third theme brings the human dimension to the debate on the future of manufacturing. Their article focuses on the challenges brought to management by the ageing of workers in Germany but, in the arguments that are raised, the future challenges associated with workers and work organisation in every production system become visible and relevant. An interesting point in the approach adopted by the authors is that not only the factual problems and solutions are taken into account, but the perceptions of the managers are also brought into the picture.

China cannot be absent from a discussion of the future of manufacturing. Therefore, within the fourth theme, Vaidya, Bennett and Liu provide evidence of the gradual improvement of Chinese companies in the medium- and high-tech sectors, using revealed comparative advantage (RCA) analysis. The Chinese evolution is shown to be based on capabilities developed through combining international technology transfer and indigenous learning. The main implication for Western companies is the need to take account of the accelerated rhythm of capability development in China. For other developing countries, China's case provides lessons of great importance.

Finally, under the fifth theme, Kuehnle's article, "Post mass production paradigm (PMPP) trajectories", provides a futuristic scenario of what is already around us and might become prevalent in the future. It takes an intensive look at a whole set of dimensions that are affecting manufacturing now and will influence manufacturing in the future, ranging from the application of ICT to the need for social transparency.

In summary, this special issue of JMTM presents a brief but indisputable demonstration of the possible richness of manufacturing in the future. Indeed, we could even say that manufacturing has no future if we only stick to past perspectives. Embracing the new is not easy. The new configurations of production systems, the distributed and complementary roles to be performed by distinct types of companies in diversified networked structures, leveraged by newly emerging technologies and the associated new challenges for managing people, are all themes that are carriers of the future. The Guest Editors of this special issue on the future of manufacturing are strongly convinced that their undertaking has been worthwhile.

Relevance:

90.00%

Publisher:

Abstract:

Software bug analysis is one of the most important activities in software quality. The rapid and correct implementation of the necessary repair affects both developers, who must deliver fully functioning software, and users, who need to perform their daily tasks. In this context, an incorrect classification of bugs may lead to unwanted situations. One of the main attributes assigned to a bug at the time of its initial report is severity, which reflects the urgency of correcting the problem. In this scenario, we identified, in datasets extracted from five open source systems (Apache, Eclipse, Kernel, Mozilla and Open Office), an irregular distribution of bugs with respect to the existing severities, which is an early sign of misclassification. In the datasets analyzed, about 85% of the bugs are ranked with normal severity. This classification rate can have a negative influence on the software development context, where a misclassified bug may be allocated to a developer with little experience to solve it; its correction may then take longer, or even result in an incorrect implementation. Several studies in the literature have disregarded normal bugs, working only with the portion of bugs initially considered severe or non-severe. This work investigated that portion of the data, with the purpose of identifying whether the normal severity reflects the real impact and urgency, whether there are bugs (initially classified as normal) that could be classified with another severity, and whether there are impacts for developers in this context. For this, an automatic classifier was developed, based on three algorithms (Naïve Bayes, MaxEnt and Winnow), to assess whether normal severity is correct for the bugs initially categorized with this severity. The algorithms presented an accuracy of about 80% and showed that between 21% and 36% of the bugs should have been classified differently (depending on the algorithm), which represents somewhere between 70,000 and 130,000 bugs in the dataset.
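
The work described above is essentially a text-classification task: deciding, from the report itself, whether "normal" is the right severity. Below is a minimal sketch of that kind of pipeline in scikit-learn. It is not the thesis's classifier: the file name, column names and label set are hypothetical, and TF-IDF with Multinomial Naïve Bayes stands in for the three algorithms (Naïve Bayes, MaxEnt, Winnow) evaluated in the study.

```python
# Sketch: predict bug severity from report text, then flag "normal"-labelled
# reports whose predicted severity disagrees. Data layout is hypothetical.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.metrics import classification_report

bugs = pd.read_csv("bug_reports.csv")        # assumed columns: summary, severity
X_train, X_test, y_train, y_test = train_test_split(
    bugs["summary"], bugs["severity"], test_size=0.2, random_state=42)

model = make_pipeline(TfidfVectorizer(stop_words="english"), MultinomialNB())
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))

# Candidates for reclassification: labelled "normal" but predicted otherwise.
normal = bugs[bugs["severity"] == "normal"]
disagreements = normal[model.predict(normal["summary"]) != "normal"]
print(len(disagreements), "potentially misclassified 'normal' bugs")
```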

Relevance:

90.00%

Publisher:

Abstract:

Over the past few years, logging has evolved from simple printf statements to more complex and widely used logging libraries. Today logging information is used to support various development activities such as fixing bugs, analyzing the results of load tests, monitoring performance and transferring knowledge. Recent research has examined how to improve logging practices by informing developers what to log and where to log. Furthermore, the strong dependence on logging has led to the development of logging libraries that have reduced the intricacies of logging, which has resulted in an abundance of log information. Two recent challenges have emerged as modern software systems start to treat logging as a core aspect of their software: 1) infrastructural challenges, due to the plethora of logging libraries available today, and 2) processing challenges, due to the large number of log processing tools that ingest logs and produce useful information from them. In this thesis, we explore these two challenges. We first explore the infrastructural challenges that arise due to the plethora of logging libraries available today. As systems evolve, their logging infrastructure has to evolve as well (commonly by migrating to new logging libraries). We explore logging library migrations within Apache Software Foundation (ASF) projects and find that close to 14% of the projects within the ASF migrate their logging libraries at least once. For the processing challenges, we explore the different factors that can affect the likelihood of a logging statement changing in the future in four open source systems, namely ActiveMQ, Camel, Cloudstack and Liferay. Such changes are likely to negatively impact the log processing tools that must be updated to accommodate them. We find that 20%-45% of the logging statements within the four systems are changed at least once. We construct random forest classifiers and Cox models to determine the likelihood of both just-introduced and long-lived logging statements changing in the future, and find that file ownership, developer experience, log density and SLOC are important factors in determining the stability of logging statements.
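
Since the abstract names the concrete factors used to model logging-statement stability (file ownership, developer experience, log density, SLOC), a small sketch may help make the setup concrete. This is a hedged illustration, not the thesis's code: the input file and column names are hypothetical, and a random forest with cross-validated AUC stands in for the classifiers and Cox models described above.

```python
# Sketch: estimate how well simple metrics predict whether a logging
# statement will later change. Data layout is hypothetical.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

stmts = pd.read_csv("logging_statements.csv")
X = stmts[["file_ownership", "developer_experience", "log_density", "sloc"]]
y = stmts["changed"]                     # 1 if the statement was later modified

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("mean AUC:", cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean())

clf.fit(X, y)
print(dict(zip(X.columns, clf.feature_importances_)))  # factors discussed above

# The time until a statement changes could instead be modelled with a Cox
# proportional hazards model (e.g. lifelines' CoxPHFitter), as in the thesis.
```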

Relevance:

80.00%

Publisher:

Abstract:

The aim of this paper is to develop models for experimental open-channel water delivery systems and to assess the use of three data-driven modeling tools toward that end. Water delivery canals are nonlinear dynamical systems and should therefore be modeled in a way that meets given operational requirements while capturing all relevant dynamics, including transport delays. Typically, the derivation of first-principles models for open-channel systems is based on the Saint-Venant equations for shallow water, which is a time-consuming task and demands specific expertise. The present paper proposes and assesses the use of three data-driven modeling tools: artificial neural networks, composite local linear models and fuzzy systems. The canal of the Hydraulics and Canal Control Nucleus (Évora University, Portugal) is used as a benchmark: the models are identified using data collected from the experimental facility, and their performance is then assessed against suitable validation criteria. The performance of all models is compared with each other and against the experimental data to show the effectiveness of such tools in capturing all significant dynamics within the canal system and, therefore, in providing accurate nonlinear models that can be used for simulation or control. The models are available from the authors upon request.
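
As a concrete illustration of the data-driven route the paper takes (as opposed to deriving Saint-Venant-based models), the sketch below identifies a NARX-style one-step-ahead neural-network predictor from input/output records. It is only an assumed setup: the file, the signal names (gate opening u, downstream water level y) and the lag order are placeholders, and a scikit-learn MLP stands in for the paper's three modeling tools.

```python
# Sketch: identify a one-step-ahead canal model y(k) = f(y(k-1..k-3), u(k-1..k-3)).
# Signal names, file and lag order are assumptions, not the NuHCC dataset.
import numpy as np
import pandas as pd
from sklearn.neural_network import MLPRegressor

data = pd.read_csv("canal_run.csv")           # assumed columns: u (gate), y (level)
u, y = data["u"].to_numpy(), data["y"].to_numpy()

lags = 3
X = np.column_stack([np.roll(y, k) for k in range(1, lags + 1)] +
                    [np.roll(u, k) for k in range(1, lags + 1)])[lags:]
target = y[lags:]

split = int(0.7 * len(target))                # identification / validation split
model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=0)
model.fit(X[:split], target[:split])
rmse = np.sqrt(np.mean((model.predict(X[split:]) - target[split:]) ** 2))
print("validation RMSE:", rmse)
```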

Relevance:

80.00%

Publisher:

Abstract:

As the population ages, concerns about ensuring its well-being grow, creating the need to develop tools that allow this sector of the population to be monitored continuously. The use of smartphones by older people can be crucial to their well-being and autonomy, contributing to the collection of important information, since these devices are often equipped with sensors that can give the caregiver valuable indications about the patient's current state. The sensors can provide data on the patient's physical activity, as well as detect falls or calculate their position, with the help of the accelerometer, the gyroscope and the magnetic field sensor. However, such functionalities necessarily require a minimum sensor sampling frequency that allows the implementation of algorithms to determine these parameters as accurately as possible. Since patients do not always carry their smartphone with them when they are at home, creating AAL (Ambient Assisted Living) environments using external devices that can be "worn" by patients can also be a suitable solution. These devices normally contain the same sensors as smartphones and communicate with them through wireless technologies such as Bluetooth Low Energy. In this work, the possibility of changing the sensor sampling frequency in different operating systems was evaluated, with modifications being made to the default installations of some open operating systems. To enable the creation of an AAL solution based on an external device, services and profiles were implemented on an external device, the SensorTag.
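
To make the sampling-frequency point concrete, the sketch below shows the simplest kind of accelerometer-based fall detector: it flags samples whose total acceleration magnitude exceeds an impact threshold, and it only works if the sensor delivers samples fast enough to catch the impact peak. The 2.5 g threshold and the sample stream are illustrative assumptions, not values from the thesis.

```python
# Sketch: threshold-based fall detection on 3-axis accelerometer samples (in g).
# Threshold and data are illustrative; real detectors combine several features.
import math

FALL_THRESHOLD_G = 2.5   # assumed impact threshold

def magnitude(ax, ay, az):
    """Total acceleration magnitude of one sample, in g."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_falls(samples):
    """Return indices of samples whose magnitude exceeds the threshold."""
    return [i for i, (ax, ay, az) in enumerate(samples)
            if magnitude(ax, ay, az) > FALL_THRESHOLD_G]

# Example: resting samples (~1 g) followed by a simulated impact spike.
stream = [(0.0, 0.0, 1.0)] * 5 + [(2.1, 1.8, 2.0)] + [(0.0, 0.0, 1.0)] * 5
print(detect_falls(stream))   # -> [5]
```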

Relevance:

80.00%

Publisher:

Abstract:

Following targeted advertising and product placement, TV and online media need more personalised methods of engaging viewers by integrating advertising and informational messages into playout content, whether real-time broadcast or on-demand. Future advertising solutions need to adapt to individuals or online groups in order to respond to the commercial requirements of clients and agencies.

Relevance:

80.00%

Publisher:

Abstract:

We report two unusual cases of sudden unexpected death in children. Histopathologic examination showed intimal fibroplasia, a variant of fibromuscular dysplasia, of the right coronary artery, associated in both cases with fatty infiltration of the right ventricular myocardium. The significance of this particular combination of two lesions known to induce sudden death in young people is discussed.

Relevance:

80.00%

Publisher:

Abstract:

This paper studies the effects of financial liberalization and banking crises on growth. It shows that financial liberalization on average spurs economic growth. Banking crises are harmful for growth, but to a lesser extent in countries with open financial systems and good institutions. The positive effect of financial liberalization is robust to different definitions. While the removal of capital account restrictions is effective by increasing financial depth, equity market liberalization affects growth directly. The empirical analysis is performed through GMM dynamic panel data estimations on a panel of 90 countries observed over the period 1975-1999.
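
For readers who want to see the shape of such a panel regression, the sketch below regresses growth on liberalization and crisis indicators with country fixed effects and a lagged dependent variable. This within estimator is only a simplified stand-in: with a lagged dependent variable and a short panel it is biased, which is exactly why the paper relies on GMM dynamic panel estimation instead. The file and column names are hypothetical.

```python
# Sketch: fixed-effects growth regression with a lagged dependent variable.
# Simplified stand-in for the paper's GMM dynamic panel estimator; the data
# layout (country, year, growth, lib, crisis) is assumed.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("growth_panel.csv").sort_values(["country", "year"])
panel["growth_lag"] = panel.groupby("country")["growth"].shift(1)

model = smf.ols("growth ~ growth_lag + lib + crisis + C(country)",
                data=panel.dropna()).fit()
print(model.params[["growth_lag", "lib", "crisis"]])
```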

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND: Previous published studies have shown significant variations in colonoscopy performance, even when medical factors are taken into account. This study aimed to examine the role of nonmedical factors (ie, embodied in health care system design) as possible contributors to variations in colonoscopy performance. METHODS: Patient data from a multicenter observational study conducted between 2000 and 2002 in 21 centers in 11 western countries were used. Variability was captured through 2 performance outcomes (diagnostic yield and colonoscopy withdrawal time), jointly studied as dependent variables, using a multilevel 2-equation system. RESULTS: Results showed that open-access systems and high-volume colonoscopy centers were independently associated with a higher likelihood of detecting significant lesions and longer withdrawal durations. Fee for service (FFS) payment was associated with shorter withdrawal durations, and so had an indirect negative impact on the diagnostic yield. Teaching centers exhibited lower detection rates and longer withdrawal times. CONCLUSIONS: Our results suggest that gatekeeping colonoscopy is likely to miss patients with significant lesions and that developing specialized colonoscopy units is important to improve performance. Results also suggest that FFS may result in a lower quality of care in colonoscopy practice and highlight the fact that longer withdrawal times do not necessarily indicate higher quality in teaching centers.

Relevance:

80.00%

Publisher:

Abstract:

An old erg covers the northern part of the Lake Chad basin. This dune landform allowed the formation of many interdune ponds of various sizes. Still present in certain zones where the groundwater level is high (e.g. Kanem, southern Manga), these ponds formed in the past a vast network of lacustrine microsystems, as shown by the nature and the distribution of their deposits. In the Manga, these interdune deposits represent the main sedimentary records of the Holocene environmental succession. Their paleobiological (pollens, diatoms, ostracods) and geochemical (δ18O, δ13C, Sr/Ca) contents are often the basis for paleoenvironmental reconstruction; their sedimentological characters, on the other hand, are rarely exploited. This study of the palustro-lacustrine deposits of the Holocene N'Guigmi lake (northern bank of Lake Chad, Niger) is based on the relationships between sedimentological features and climato-hydrological fluctuations. The mineralogical parameters (e.g. calcium carbonate content, clay mineralogy) and the nature of the autochthonous mineralization (i.e. amorphous silica, clays, calcium carbonates) can be interpreted using a straightforward hydro-sedimentary model. Established to explain the geochemical dynamics of Lake Chad, this model is based on a biogeochemical cycle of the main elements (i.e. silicon, calcium) directly controlled by the local hydrological balance (i.e. rainfall/evaporation ratio). All these results show that a detailed study of sedimentological features can provide important paleohydrological information about the regional aridification since ca 6500 14C BP.

Relevance:

80.00%

Publisher:

Abstract:

Aims: To evaluate the effectiveness and safety of posterior sub-Tenon (PST) triamcinolone acetonide (TA) injection for persistent macular oedema associated with non-ischemic central retinal vein occlusion (CRVO) or branch retinal vein occlusion (BRVO) in non-vitrectomized eyes. Methods: Fourteen consecutive eyes of 14 patients with macular oedema lasting more than 3 months and a visual acuity of less than 20/40 were enrolled. Six eyes presented with BRVO and 8 eyes with CRVO. PST injection of 40 mg TA was performed under topical anaesthesia. All patients were phakic and were followed for at least 6 months. Snellen visual acuity converted to LogMAR units and anatomic responses were evaluated before and at 1, 3, 6, and 12 (if required) months after injection, and re-injection was considered. Results: In the BRVO group, mean foveal thickness was 548.2±49.5 μm preoperatively, and 452.8±56.2 μm and 280.8±62.5 μm at the 1- and 12-month follow-up, respectively. Statistical analysis showed significant differences between preoperative and postoperative measurements (P<.05, paired t test) 3 months after injection. Improvement of visual acuity by at least 0.2 LogMAR was seen in 3 (50%) of the 6 eyes. No re-injection was needed. In the CRVO group, mean foveal thickness was 543.7±34.4 μm preoperatively, and 283.0±29.0 μm and 234.8±23.6 μm at the 1- and 12-month follow-up, respectively. Statistical analysis showed significant differences between preoperative and postoperative measurements (P<.05, paired t test). Improvement of visual acuity by at least 0.2 LogMAR was seen in 7 eyes (88%). The mean number of re-injections was 2.1±0.3. Intraocular pressure elevation of 22 mm Hg or higher was found in 2/14 eyes (14%). Cataract progression was noted in 5/14 eyes (36%). Conclusions: PST injection of TA appears to be a safe and effective treatment for chronic macular oedema due to either non-ischemic BRVO or CRVO, with better efficacy in BRVO.

Relevance:

80.00%

Publisher:

Abstract:

In this Thesis the interaction of an electromagnetic field and matter is studied from various aspects in the general framework of cold atoms. Our subjects cover a wide spectrum of phenomena, ranging from semiclassical few-level models to fully quantum mechanical interaction with structured reservoirs leading to non-Markovian open quantum system dynamics. Within closed quantum systems, we propose a selective method to manipulate the motional state of atoms in a time-dependent double-well potential and interpret the method in terms of adiabatic processes. We also derive a simple wave-packet model, based on distributions of generalized eigenstates, explaining the finite visibility of interference in overlapping continuous-wave atom lasers. In the context of open quantum systems, we develop an unraveling of non-Markovian dynamics in terms of piecewise deterministic quantum jump processes confined to the Hilbert space of the reduced system - the non-Markovian quantum jump method. As examples, we apply it to simple 2- and 3-level systems interacting with a structured reservoir. Finally, in the context of ion-cavity QED, we study entanglement generation based on collective Dicke modes under experimentally realistic conditions, including photonic losses and atomic spontaneous decay.
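
For orientation, the non-Markovian quantum jump unravelling mentioned above applies to time-local master equations of the generic form shown below, in which the decay rates Δ_k(t) may become temporarily negative; such negative-rate intervals signal memory and are handled in the unravelling by reverse jumps. This is the standard structure used in that literature, quoted as background and not reproduced from the Thesis.

```latex
\frac{d\rho(t)}{dt} = -\frac{i}{\hbar}\bigl[H_S(t),\rho(t)\bigr]
  + \sum_k \Delta_k(t)\Bigl( C_k\,\rho(t)\,C_k^{\dagger}
  - \tfrac{1}{2}\bigl\{ C_k^{\dagger}C_k,\,\rho(t) \bigr\} \Bigr)
```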

Relevance:

80.00%

Publisher:

Abstract:

Industrial cooling systems are needed to control process temperature and pressure. Water is the most widely used heat transfer medium thanks to its good availability, low price and high heat transfer capacity. Cooling systems are divided into three main types: once-through cooling, closed circulation cooling and open circulation cooling. Each system type has its typical subtypes. Open circulation systems have the most subtypes, the most common of which is the cooling tower. There are three types of cooling towers: wet, dry and hybrid towers. Each system type has characteristic features regarding applications, environmental impacts, controllability, and investment and operating costs, which are presented in this work. In addition to presenting industrial cooling systems, the work investigates the suitability of a vacuum degasser for removing gas from a closed circulation cooling system. Air remains in a closed circulation cooling system during the filling stage and is carried in dissolved form with the cooling water used. The resulting supersaturated mixture generates air bubbles in the water, which cause corrosion both chemically and through erosion. In addition, the gas bubbles take up volume from the liquid. This reduces the cooling capacity of the system significantly, because the heat transfer capacity of gas is small compared to that of water. The work also presents other possible gas sources in a closed system and the problems they cause. The gas separation efficiency of the vacuum degasser was measured by the clarification rate of cooling water samples and by the improvement in heat exchanger performance. Over a two-week observation period, clarification times improved by 36-60% at the different measurement points and heat exchanger performance improved by 6-29%. However, a significant amount of gas remained in the system even though the device was kept in use after the observation period, so the targets were not met. The studied vacuum degassing device was found to be unsuitable for a factory environment due to its lack of durability, awkward operation and poor effectiveness. Nevertheless, the results show that gas separation has a significant effect on the operation of a closed cooling system and the cooling capacity that can be achieved.

Relevance:

80.00%

Publisher:

Abstract:

This Thesis discusses the phenomenology of the dynamics of open quantum systems marked by non-Markovian memory effects. Non-Markovian open quantum systems are the focal point of a flurry of recent research aiming to answer, e.g., the following questions: What is the characteristic trait of non-Markovian dynamical processes that discriminates them from forgetful Markovian dynamics? What is the microscopic origin of memory in quantum dynamics, and how can it be controlled? Does the existence of memory effects open new avenues and enable accomplishments that cannot be achieved with Markovian processes? These questions are addressed in the publications forming the core of this Thesis, with case studies of both prototypical and more exotic models of open quantum systems. In the first part of the Thesis several ways of characterizing and quantifying non-Markovian phenomena are introduced. Their differences are then explored using a driven, dissipative qubit model. The second part of the Thesis focuses on the dynamics of a purely dephasing qubit model, which is used to unveil the origin of non-Markovianity for a wide class of dynamical models. The emergence of memory is shown to be strongly intertwined with the structure of the spectral density function, as further demonstrated in a physical realization of the dephasing model using ultracold quantum gases. Finally, as an application of memory effects, it is shown that non-Markovian dynamical processes facilitate a novel phenomenon of time-invariant discord, where the total quantum correlations of a system are frozen to their initial value. Non-Markovianity can also be exploited in the detection of phase transitions using quantum information probes, as shown using the physically interesting models of the Ising chain in a transverse field and a Coulomb chain undergoing a structural phase transition.
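
As background for the dephasing discussion, a purely dephasing qubit coupled to a bosonic reservoir at temperature T has coherences that decay as shown below, with the decoherence function Γ(t) fixed entirely by the spectral density J(ω); intervals where Γ(t) decreases (recoherence) are the signature of non-Markovian memory. This is the textbook spin-boson dephasing result, given for context rather than copied from the Thesis.

```latex
\rho_{01}(t) = \rho_{01}(0)\,e^{-\Gamma(t)}, \qquad
\Gamma(t) = \int_{0}^{\infty} d\omega\, J(\omega)\,
  \coth\!\Bigl(\frac{\hbar\omega}{2 k_{B} T}\Bigr)\,
  \frac{1-\cos(\omega t)}{\omega^{2}}
```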

Relevance:

80.00%

Publisher:

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014