905 results for New Generation Rollingstock Depot
Abstract:
Innovations diffuse at different speeds among the members of a social system through various communication channels. Early adopters can be seen as the most influential reference group on which the majority of people base their innovation adoption decisions, and they can therefore often accelerate the diffusion of innovations. The purpose of this research is to discover means of diffusing an innovative product in the Finnish market through influential early adopters, with respect to the characteristics of the case product. This purpose is pursued through the following sub-objectives: Who are the potential early adopters for the case product, and why? How should the potential early adopters of the case product be communicated with? What are the expectations, preferences, and experiences of the early adopters of the case product? The case product examined in this research is a new board game called Rock Science, considered an incremental innovation that brings board gaming and hard rock music together in a new way. The research was conducted in two parts using both qualitative and quantitative methods. This mixed-method research began with interviews of six music industry experts. The information gathered from the interviews enabled the researcher to compose the questionnaire for the quantitative part of the study. The internet survey yielded a sample of 97 responses from the targeted population. The key findings of the study suggest that (1) the potential early adopters for the case product are most likely young adults from the capital city area with a strong interest in rock music, (2) the early adopters can be reached effectively through credible online sources of information, and (3) the respondents' overall product feedback is highly positive, except for the quality-price ratio of the product.
This research indicates that more effective diffusion of the Rock Science board game in Finland can be achieved through (1) strategic alliances with the music industry and media partnerships, (2) pricing adjustments, (3) use of supporting game formats, and (4) innovative use of various social media channels.
Abstract:
The incidence of diabetic neuropathy increases with the duration of diabetes and the degree of hyperglycaemia. Pain is one of the most common and incapacitating symptoms of diabetic neuropathy, and its pharmacological control is complex. The effectiveness of antidepressive agents has been described in different types of neuropathic pain, but their effectiveness as analgesics in painful diabetic neuropathy remains controversial. Objective: To review the possible role of new-generation antidepressive agents in the treatment of pain in diabetic peripheral neuropathy. This work thus consisted of a meta-analysis to determine which antidepressive agent had the best analgesic potential for managing pain in patients suffering from painful diabetic neuropathy. Methods: The search covered the Cochrane, MEDLINE, EMBASE and LILACS databases between January 2000 and August 2007. The following information was obtained from each article: criteria for diagnosing diabetic neuropathy, patients' average age, antidepressant drug received and dose, sample size, duration of the disease and treatment follow-up, outcome measurement, evaluation of pain, and rescue medication. Results: A combined RR of 1.67 (95% CI 1.38-2.02) was obtained; this result indicated that the antidepressive agent duloxetine was effective for controlling pain in diabetic neuropathy. The corresponding NNT for duloxetine was 6 (95% CI 5-8) for achieving greater than 50% pain relief in patients suffering from painful diabetic neuropathy. Discussion: Antidepressive agents are frequently employed in the specific case of diabetic neuropathy; their analgesic benefit has been demonstrated.
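The reported effect sizes can be connected with a short worked example. The NNT follows from the absolute risk reduction (NNT = 1/ARR), and under a relative risk RR the ARR equals the control-group responder rate times (RR - 1). The sketch below is purely illustrative: the 25% control responder rate is an assumed value, not a figure from the meta-analysis; only the combined RR of 1.67 is taken from the abstract.

```python
import math

def nnt_from_rr(rr: float, control_rate: float) -> int:
    """Number needed to treat from a relative risk for a beneficial outcome.

    ARR = control_rate * (rr - 1); NNT = 1 / ARR, rounded up.
    """
    arr = control_rate * (rr - 1.0)
    return math.ceil(1.0 / arr)

# Combined RR = 1.67 as reported; 25% control responder rate is assumed
# here purely for illustration.
print(nnt_from_rr(1.67, 0.25))  # -> 6
```

With these assumptions the computation reproduces the order of magnitude of the reported NNT of 6; a different control responder rate would shift the result accordingly.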
Abstract:
Models of root system growth emerged in the early 1970s, and were based on mathematical representations of root length distribution in soil. The last decade has seen the development of more complex architectural models and the use of computer-intensive approaches to study developmental and environmental processes in greater detail. There is a pressing need for predictive technologies that can integrate root system knowledge, scaling from molecular to ensembles of plants. This paper makes the case for more widespread use of simpler models of root systems based on continuous descriptions of their structure. A new theoretical framework is presented that describes the dynamics of root density distributions as a function of individual root developmental parameters such as rates of lateral root initiation, elongation, mortality, and gravitropism. The simulations resulting from such equations can be performed most efficiently in discretized domains that deform as a result of growth, and that can be used to model the growth of many interacting root systems. The modelling principles described help to bridge the gap between continuum and architectural approaches, and enhance our understanding of the spatial development of root systems. Our simulations suggest that root systems develop in travelling wave patterns of meristems, revealing order in otherwise spatially complex and heterogeneous systems. Such knowledge should assist physiologists and geneticists to appreciate how meristem dynamics contribute to the pattern of growth and functioning of root systems in the field.
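The idea of meristems forming a travelling wave can be illustrated with a toy simulation. The sketch below is not the paper's continuum model: it tracks individual root tips with assumed rates of elongation, lateral initiation, and mortality, and shows that the cloud of lateral meristems advances down the profile behind the primary tip.

```python
import random

# Toy sketch of meristem dynamics (illustrative; all rate values are
# assumed, not taken from the paper). Each meristem is a root tip
# characterised only by its depth.
ELONG = 1.0      # primary-tip elongation rate (cm/day), assumed
LAT_ELONG = 0.4  # lateral-tip elongation rate (cm/day), assumed
INIT = 2.0       # lateral initiation rate (tips/day), assumed
MORT = 0.05      # meristem mortality rate (per tip per day), assumed
DT = 0.1         # time step (days)

def simulate(days: float, seed: int = 0):
    """Return the primary tip depth and the depths of lateral meristems."""
    rng = random.Random(seed)
    primary = 0.0
    laterals = []
    for _ in range(round(days / DT)):
        primary += ELONG * DT
        # surviving laterals elongate; each dies with probability MORT*DT
        laterals = [d + LAT_ELONG * DT for d in laterals
                    if rng.random() > MORT * DT]
        # new laterals initiate just behind the advancing primary tip
        if rng.random() < INIT * DT:
            laterals.append(max(0.0, primary - 1.0))
    return primary, laterals

tip, tips = simulate(30)
mean_depth = sum(tips) / len(tips)
print(f"primary tip: {tip:.1f} cm; {len(tips)} lateral meristems, "
      f"mean depth {mean_depth:.1f} cm")
```

Sampling the lateral depths at successive times would show the distribution shifting downward at a roughly constant speed, a discrete analogue of the travelling-wave patterns of meristems described above.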
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution are not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure.
Such facilities will make it possible to investigate what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level; current limitations in computing power have severely constrained such an investigation, which is now badly needed. These facilities will also provide the world's scientists with computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure, including hardware, software, and data analysis support, and the scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it, and will ultimately enable the climate community to provide society with climate predictions based on our best scientific knowledge and the most advanced technology.
Abstract:
Human ICT implants, such as RFID implants, cochlear implants, cardiac pacemakers, Deep Brain Stimulation, bionic limbs connected to the nervous system, and networked cognitive prostheses, are becoming increasingly complex. With ever-growing data processing functionalities in these implants, privacy and security become vital concerns. Electronic attacks on human ICT implants can cause significant harm, both to implant subjects and to their environment. This paper explores the vulnerabilities to crime victimisation that human implants create in light of recent technological developments, and analyses how the law can deal with the emerging challenges of what may well become the next generation of cybercrime: attacks targeted at technology implanted in the human body. After a state-of-the-art description of relevant types of human implants and a discussion of how these implants challenge existing perceptions of the human body, we describe how various modes of attack, such as sniffing, hacking, data interference, and denial of service, can be committed against implants. Subsequently, we analyse how these attacks can be assessed under current substantive and procedural criminal law, drawing on examples from UK and Dutch law. The possibilities and limitations of cybercrime provisions (eg, unlawful access, system interference) and bodily integrity provisions (eg, battery, assault, causing bodily harm) to deal with human-implant attacks are analysed. Based on this assessment, the paper concludes that attacks on human implants are not only a new generation in the evolution of cybercrime, but also raise fundamental questions about how criminal law conceives of attacks. Traditional distinctions between physical and non-physical modes of attack, between human bodies and things, and between the exterior and interior of the body need to be re-interpreted in light of developments in human implants.
As the human body and technology become increasingly intertwined, cybercrime legislation and body-integrity crime legislation will also become intertwined, posing a new puzzle that legislators and practitioners will sooner or later have to solve.
Abstract:
The contribution of vector boson fusion processes to the production of a heavy charged lepton accompanied by its neutral partner at multi-TeV hadronic colliders is studied. It is shown that vector boson fusion dominates over the usual quark-antiquark fusion mechanism for very heavy leptons at high energies. © 1986.
Abstract:
Includes bibliography
Abstract:
Includes bibliography
Abstract:
Active Galactic Nuclei (AGN) are luminous, compact sources powered by the accretion of matter onto the supermassive black hole at the centre of a galaxy. A fraction of AGN, termed "radio-loud", emit strongly at radio wavelengths through relativistic jets accelerated by the black hole. Misaligned AGN (MAGN) are radio-loud sources whose jet is not aligned with our line of sight (radio galaxies and SSRQs). The vast majority of extragalactic sources observed in the gamma-ray band are blazars, while, in the TeV band in particular, only 4 MAGN have been observed. The aim of this thesis is to assess the impact of the Cherenkov Telescope Array (CTA), the new TeV instrument, on MAGN studies. After studying the properties of the 4 TeV MAGN using MeV-GeV data from the Fermi telescope and TeV data from the literature, we took the MAGN observed by Fermi as TeV candidates. We then simulated 50 hours of CTA observations for each source and computed their detection significance. Assuming a direct extrapolation of the Fermi spectrum, we predict the discovery of 9 new TeV MAGN with CTA, all local FR I sources. Applying an exponential cutoff at 100 GeV, the more realistic spectral shape according to the observational data, we predict the discovery of 2-3 new TeV MAGN. As for spectral analysis with CTA, our studies indicate that a spectrum can be obtained for 5 new sources with observing times of the order of 250 hours. In both cases, the best candidates are always local sources (z<0.1) with a flat Fermi spectrum (Gamma<2.2). The best observing strategy to achieve these results does not match the current plans for CTA, which foresee an unpointed survey: these sources are faint and require long pointed observations to be detected (at least 50 hours for integrated-flux studies and 250 hours for spectral studies).
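The difference between a direct extrapolation of the Fermi spectrum and one with an exponential cutoff at 100 GeV can be sketched numerically. The snippet below is a generic illustration with assumed normalisation, spectral index, and energy range (none taken from the thesis): it compares the integral flux above 100 GeV for a pure power law and for the same power law damped by exp(-E/100 GeV).

```python
import math

# Illustrative spectral model (all parameter values assumed).
def dnde(E, N0=1e-11, E0=1.0, gamma=2.0, Ecut=None):
    """Differential flux dN/dE at energy E (GeV), arbitrary units."""
    f = N0 * (E / E0) ** (-gamma)
    if Ecut is not None:
        f *= math.exp(-E / Ecut)  # exponential cutoff
    return f

def integral_flux(Emin, Emax, n=100000, **kw):
    """Trapezoidal integral of dN/dE over [Emin, Emax], log-spaced grid."""
    xs = [Emin * (Emax / Emin) ** (i / n) for i in range(n + 1)]
    total = 0.0
    for a, b in zip(xs, xs[1:]):
        total += 0.5 * (dnde(a, **kw) + dnde(b, **kw)) * (b - a)
    return total

plain = integral_flux(100.0, 1e5)            # 100 GeV - 100 TeV, power law
cut = integral_flux(100.0, 1e5, Ecut=100.0)  # same, with 100 GeV cutoff
print(f"the cutoff suppresses the integral flux by a factor {plain / cut:.1f}")
```

Even a cutoff placed exactly at the lower integration bound suppresses the integral flux by a sizeable factor, which is why the predicted number of CTA detections drops from 9 to 2-3 when the cutoff is applied.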
Abstract:
HP802-247 is a new-generation, allogeneic tissue-engineering product consisting of growth-arrested human keratinocytes (K) and fibroblasts (F) delivered in a fibrin matrix by a spray device.
Abstract:
Prosthesis-patient mismatch (PPM) remains a controversial issue with the most recent stented biological valves. We analyzed the incidence of PPM after implantation of the Carpentier-Edwards Perimount Magna Ease aortic valve (PMEAV) bioprosthesis and assessed the early clinical outcome. Two hundred and seventy consecutive patients who received a PMEAV bioprosthesis between January 2007 and July 2008 were analyzed. Pre-, peri- and postoperative data were assessed, and echocardiographic as well as clinical follow-up was performed. Mean age was 72 ± 9 years; 168 patients (62.2%) were male. Fifty-seven patients (21.1%) were below 65 years of age. Absence of PPM, corresponding to an indexed effective orifice area >0.85 cm²/m², was 99.5%. Observed in-hospital mortality was 2.2% (six patients), against a predicted mortality of 7.6 ± 3.1% according to the additive EuroSCORE. At echocardiographic assessment after a mean follow-up of 150 ± 91 days, the mean transvalvular gradient was 11.8 ± 4.8 mmHg (all valve sizes). No paravalvular leakage was seen. Nine patients died during follow-up. The Carpentier-Edwards PMEAV bioprosthesis shows excellent hemodynamic performance. This valve can be implanted in all sizes with an incidence of severe PPM below 0.5%.
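The PPM classification used above can be reproduced with a small helper. The sketch below is illustrative: body surface area is computed with the standard Du Bois formula, the >0.85 cm²/m² "no mismatch" threshold is the one stated in the abstract, the 0.65 cm²/m² cutoff for severe PPM is a commonly used value assumed here, and the example patient is hypothetical.

```python
def bsa_dubois(height_cm: float, weight_kg: float) -> float:
    """Body surface area (m^2) via the Du Bois formula."""
    return 0.007184 * height_cm ** 0.725 * weight_kg ** 0.425

def ppm_grade(eoa_cm2: float, bsa_m2: float) -> str:
    """Grade prosthesis-patient mismatch from the indexed EOA (cm^2/m^2).

    >0.85: none (threshold stated in the abstract); 0.65-0.85: moderate;
    <=0.65: severe (commonly used cutoff, assumed here).
    """
    ieoa = eoa_cm2 / bsa_m2
    if ieoa > 0.85:
        return "none"
    if ieoa > 0.65:
        return "moderate"
    return "severe"

# Hypothetical patient: 175 cm, 80 kg, valve EOA 1.8 cm^2
bsa = bsa_dubois(175, 80)
print(f"BSA = {bsa:.2f} m^2, PPM = {ppm_grade(1.8, bsa)}")
```

For this hypothetical patient the indexed EOA is about 0.92 cm²/m², which falls above the 0.85 cm²/m² threshold and is therefore classified as no mismatch.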
Abstract:
The aim of this study is to assess the serial changes in strut apposition and coverage of bioresorbable vascular scaffolds (BVS) and to relate these changes to the presence of intraluminal masses at 6 months, using optical coherence tomography (OCT).
Abstract:
In the RESOLUTE All Comers trial, the Resolute zotarolimus-eluting stent was non-inferior to the Xience V everolimus-eluting stent for the primary stent-related endpoint of target lesion failure (cardiac death, target vessel myocardial infarction, and ischaemia-driven target lesion revascularisation) at 1 year. However, data for long-term safety and efficacy from randomised studies of new-generation drug-eluting coronary stents in patients treated in routine clinical practice are scarce. We report the prespecified 2-year clinical outcomes from the RESOLUTE All Comers trial.