950 results for Route Guidance and Navigation System
Abstract:
In view of the increasing complexity of service logic and functional requirements, a new system architecture based on SOA was proposed for an equipment remote monitoring and diagnosis system. Following the design principles of SOA, the service logic and functional requirements of the remote monitoring and diagnosis system were divided into services of different levels and granularities, and a loosely coupled web services system was built. The design and implementation scheme of the core function modules of the proposed architecture was presented, and a demo system was used to validate the feasibility of the proposed architecture.
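As an illustration of the loose coupling the abstract describes, the sketch below registers services of different granularities under names in a registry, so callers depend only on a service name rather than on an implementation. All service names, functions and values here are hypothetical, not taken from the paper:

```python
# Hypothetical sketch: services of different granularity registered by name,
# so callers are coupled only to the name, not to the implementation.
SERVICE_REGISTRY = {}

def service(name):
    """Register a callable as a named service."""
    def wrap(fn):
        SERVICE_REGISTRY[name] = fn
        return fn
    return wrap

def call(name, *args, **kwargs):
    """Dispatch by service name; callers never import the implementation."""
    return SERVICE_REGISTRY[name](*args, **kwargs)

@service("diagnosis.read_sensor")      # fine-grained service (stub data)
def read_sensor(equipment_id):
    return {"equipment": equipment_id, "vibration_mm_s": 2.3}

@service("diagnosis.health_report")    # coarse-grained service composed of finer ones
def health_report(equipment_id):
    reading = call("diagnosis.read_sensor", equipment_id)
    status = "ok" if reading["vibration_mm_s"] < 4.5 else "alarm"
    return {"equipment": equipment_id, "status": status}

report = call("diagnosis.health_report", "pump-7")
```

In a real web services deployment the registry would be a service broker and the calls SOAP or REST requests; the dispatch-by-name pattern is what keeps the coupling loose.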
Abstract:
The authors study a family of tax evasion models in which a flat-rate tax finances only the provision of public goods; audits and wage differences are neglected. The paper focuses on comparing two modelling approaches. The first is based on optimizing agents endowed with social preferences: every worker earns the same income, and each year declares the amount that maximizes the sum of the utility of consumption funded from retained income and a moral utility. This moral utility is the product of three factors: the worker's exogenous tax morale, the average income declaration observed in his neighbourhood in the previous year, and the endogenous utility of his own declaration. In the second approach, agents act according to simple heuristic rules. While traditionally shaped Laffer curves are encountered in the optimizing model, the heuristics-based models exhibit (linearly) increasing Laffer curves. This difference is related to a peculiar type of behaviour: within the agent-based approach, a number of agents end up in a moral state of limbo, alternating between altruism and selfishness.
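The heuristics-based variant can be sketched as a small agent-based simulation. The declaration rule, reference level and all parameter values below are illustrative assumptions, since the abstract does not specify functional forms; the sketch only reproduces the qualitative finding that revenue grows (linearly) with the tax rate:

```python
import random

def simulate_heuristic(tax_rate, n_agents=500, years=50, seed=0):
    """Heuristic tax-evasion model: each agent's declared share of income
    is the product of its exogenous tax morale and last year's average
    declared share (observed here across the whole population, scaled by
    an assumed reference level of 0.5). Income is normalised to 1, so
    flat-rate revenue is tax_rate times total declared income."""
    rng = random.Random(seed)
    morale = [rng.uniform(0.5, 1.0) for _ in range(n_agents)]
    declared = [0.5] * n_agents                 # initial declared share
    for _ in range(years):
        avg = sum(declared) / n_agents          # last year's observed average
        declared = [min(1.0, m * avg / 0.5) for m in morale]
    return tax_rate * sum(declared)

# Laffer curve: revenue for flat tax rates 0.1 .. 0.9; because this
# declaration rule ignores the rate, revenue rises linearly with it.
curve = [(round(t / 10, 1), simulate_heuristic(t / 10)) for t in range(1, 10)]
```

An optimizing agent, by contrast, would declare less as the rate rises, which is what bends the traditional Laffer curve back down.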
Abstract:
The commercialization of inventions is complex and challenging, and therefore requires the collaboration of several actors in an economy. Even when an invention possesses significant added value, it can be successfully commercialized only in a stable macroeconomic and innovation environment, and only if proper innovation management expertise is available. ValDeal Innovations Zrt. was established to foster the commercialization of Hungarian inventions with high business potential by providing business expertise. The company used a US innovation management method, already proven in various markets and countries, covering both technology evaluation and the commercialization of inventions. Major changes to the US method proved necessary, owing to the different macroeconomic circumstances and attitudes toward innovation in Hungary. The article details these issues, together with the conclusions the members of ValDeal have drawn during the innovation management process.
Abstract:
This study investigated the effects of augmented prenatal auditory stimulation on postnatal visual responsivity and neural organization in bobwhite quail (Colinus virginianus). I delivered conspecific embryonic vocalizations before, during, or after the development of a multisensory midbrain audiovisual area, the optic tectum. Postnatal simultaneous choice tests revealed that hatchlings receiving augmented auditory stimulation as embryos during optic tectum development failed to show species-typical visual preferences for a conspecific maternal hen 72 hours after hatching. Auditory simultaneous choice tests showed that hatchlings in none of the groups had deficits in auditory function, indicating that the deficits were specific to visual function. ZENK protein expression confirmed differences in the amount of neural plasticity in multiple neuroanatomical regions of birds receiving stimulation during optic tectum development, compared to unmanipulated birds. The results of these experiments support the notion that the timing of augmented prenatal auditory stimulation relative to optic tectum development can have an enduring impact on postnatal perceptual organization.
Abstract:
The purpose of this investigation was to develop and implement a general-purpose VLSI (Very Large Scale Integration) test module, based on an FPGA (Field Programmable Gate Array) system, to verify the mechanical behavior and performance of MEM sensors, with associated corrective capabilities, and to make use of the evolving System-C, an open-source HDL (Hardware Description Language), for the design of the FPGA functional units. System-C is becoming widely accepted as a platform for modeling, simulating and implementing systems consisting of both hardware and software components. In this investigation, a dual-axis accelerometer (ADXL202E) and a temperature sensor (TMP03) were used to verify the test module. Results of the test module measurements were analyzed for repeatability and reliability, and then compared to the sensor datasheets. Ideas for further study were identified based on the investigation and the analysis of results. ASIC (Application Specific Integrated Circuit) design concepts were also pursued.
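The repeatability analysis described above, comparing repeated measurements against a datasheet figure, can be sketched as follows; the readings and tolerance are illustrative values, not taken from the ADXL202E or TMP03 datasheets:

```python
from statistics import mean, stdev

def check_repeatability(samples, datasheet_value, tolerance):
    """Compare repeated test-module readings against a datasheet figure.

    samples         - repeated measurements from the test module
    datasheet_value - nominal value from the sensor datasheet
    tolerance       - acceptable absolute deviation (illustrative)
    Returns (sample mean, sample standard deviation, within-spec flag).
    """
    avg = mean(samples)
    spread = stdev(samples)           # repeatability indicator
    within_spec = abs(avg - datasheet_value) <= tolerance
    return avg, spread, within_spec

# Illustrative readings for a 0 g accelerometer axis (hypothetical values)
readings = [0.02, -0.01, 0.03, 0.00, 0.01]
avg, spread, ok = check_repeatability(readings, datasheet_value=0.0, tolerance=0.1)
```

In the actual test module this comparison would run on the FPGA host side against each sensor's own datasheet limits.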
Abstract:
The author explores the challenges graduate students face in preparing for a dissertation through university events. An autoethnography, using notes, observation and journal writing, and framed by genre theory and activity system theory, highlights conflicts between the university system and the culture of the student.
Abstract:
In his dialogue, "Near Term Computer Management Strategy for Hospitality Managers and Computer System Vendors," William O'Brien, Associate Professor, School of Hospitality Management at Florida International University, initially states: "The computer revolution has only just begun. Rapid improvement in hardware will continue into the foreseeable future; over the last five years it has set the stage for more significant improvements in software technology still to come. John Naisbitt's information electronics economy¹ based on the creation and distribution of information has already arrived and as computer devices improve, hospitality managers will increasingly do at least a portion of their work with software tools." At the time of this writing, Associate Professor O'Brien would have you know that, contrary to what some people might think, the computer revolution is not over; it is just beginning, still an embryo. Computer technology will only continue to develop and expand, says O'Brien with citation. "A complacent few of us who feel 'we have survived the computer revolution' will miss opportunities as a new wave of technology moves through the hospitality industry," says Professor O'Brien. "Both managers who buy technology and vendors who sell it can profit from strategy based on understanding the wave of technological innovation," is his informed opinion. Property managers who embrace rather than eschew innovation, in this case computer technology, will benefit greatly from this new science in hospitality management, O'Brien says. "The manager who is not alert to or misunderstands the nature of this wave of innovation will be the constant victim of technology," he advises. On the vendor side of the equation, O'Brien observes, "Computer-wise hospitality managers want systems which are easier and more profitable to operate.
Some view their own industry as being somewhat behind the times… They plan to pay significantly less for better computer devices. Their high expectations are fed by vendor marketing efforts…" O'Brien warns against gambling on a risky computer system by falling victim to unsubstantiated claims and pie-in-the-sky promises. He recommends affiliating with turnkey vendors who provide hardware, software, and training, or soliciting the help of large mainstream vendors such as IBM, NCR, or Apple. Many experts agree that the computer revolution has genuinely morphed into a software revolution, O'Brien informs; "…recognizing that a computer is nothing but a box in which programs run." Yes, some of the empirical data in this article is dated by now, but the core philosophy, of advancing technology and of properties continually tapping current knowledge, is sound.
Abstract:
Neural crest cells originate from the dorsal-most region of the embryonic neural tube. These cells migrate to several embryonic locations and differentiate into a variety of cell types. Cardiac neural crest (CNC) cells are a set of neural crest progenitors that aid in the proper formation of the cardiac septum, which separates the pulmonary from the systemic circulation. We have used Splotch mice to investigate whether murine CNC cells play a role during the development of the myocardium and the conduction system. Splotch mice carry a mutation in the PAX3 transcription factor and display defective CNC cell migration. Scanning electron microscopy analysis of Splotch mutant embryonic hearts reveals abnormalities in the interventricular septum. In addition, the right and left ventricular cavities appear dilated relative to a wild-type heart. Hoechst nuclear staining of Splotch heart cryosections demonstrates a decreased number of cardiomyocytes and a correspondingly thinner ventricular wall. The absence of Connexin 40 in the ventricles of Splotch mutants suggests conduction system defects. These results support the evidence that CNC cell signaling plays a role in modulating the growth and development of murine cardiomyocytes and their differentiation into conductile cells.
Abstract:
This research has explored the relationship between system test complexity and tacit knowledge. It is proposed in this thesis that the process of system testing (comprising test planning, test development, test execution, test fault analysis, test measurement, and case management) is directly affected both by complexity associated with the system under test and by other sources of complexity, independent of the system under test but related to the wider process of system testing. While a certain amount of knowledge related to the system under test is inherently tacit in nature, and therefore difficult to make explicit, it has been found that a significant amount of knowledge relating to these other sources of complexity can indeed be made explicit. While the importance of explicit knowledge has been reinforced by this research, no evidence has emerged to suggest that the availability of tacit knowledge to a test team is any less important to the process of system testing when operating in a traditional software development environment. Participants commonly expressed the sentiment that even though a considerable amount of explicit knowledge relating to the system is freely available, a good deal of the knowledge about the system under test that effective system testing demands is actually tacit in nature (approximately 60% of participants in a traditional development environment, and 60% of participants in an agile development environment, expressed similar sentiments). To cater for the availability of tacit knowledge relating to the system under test, and indeed both the explicit and tacit knowledge required by system testing in general, an appropriate knowledge management structure needs to be in place. This appears to be required irrespective of the development methodology employed.
Abstract:
In this podcast Roberta Heale talks to Dr Peter O'Halloran about the paper "After the Liverpool Care Pathway clear guidance and support on end-of-life care is needed." They discuss the newly implemented pathways and the effects these have on practice and patients.
Abstract:
Executive summary
Digital systems have transformed, and will continue to transform, our world. Supportive government policy, a strong research base and a history of industrial success make the UK particularly well-placed to realise the benefits of the emerging digital society. These benefits have already been substantial, but they remain at risk. Protecting the benefits and minimising the risks requires reliable and robust cybersecurity, underpinned by a strong research and translation system.
Trust is essential for growing and maintaining participation in the digital society. Organisations earn trust by acting in a trustworthy manner: building systems that are reliable and secure, treating people, their privacy and their data with respect, and providing credible and comprehensible information to help people understand how secure they are.
Resilience, the ability to function, adapt, grow, learn and transform under stress or in the face of shocks, will help organisations deliver systems that are reliable and secure. Resilient organisations can better protect their customers, provide more useful products and services, and earn people’s trust.
Research and innovation in industry and academia will continue to make important contributions to creating this resilient and trusted digital environment. Research can illuminate how best to build, assess and improve digital systems, integrating insights from different disciplines, sectors and around the globe. It can also generate advances to help cybersecurity keep up with the continued evolution of cyber risks.
Translation of innovative ideas and approaches from research will create a strong supply of reliable, proven solutions to difficult-to-predict cybersecurity risks. This is best achieved by maximising the diversity and number of innovations that see the light of day as products.
Policy, practice and research will all need to adapt. The recommendations made in this report seek to set up a trustworthy, self-improving and resilient digital environment that can thrive in the face of unanticipated threats, and earn the trust people place in it.
Innovation and research will be particularly important to the UK’s economy as it establishes a new relationship with the EU. Cybersecurity delivers important economic benefits, both by underpinning the digital foundations of UK business and trade and also through innovation that feeds directly into growth. The findings of this report will be relevant regardless of how the UK’s relationship to the EU changes.
Headline recommendations
● Trust: Governments must commit to preserving the robustness of encryption, including end-to-end encryption, and promoting its widespread use. Encryption is a foundational security technology that is needed to build user trust, improve security standards and fully realise the benefits of digital systems.
● Resilience: Government should commission an independent review of the UK’s future cybersecurity needs, focused on the institutional structures needed to support resilient and trustworthy digital systems in the medium and longer term. A self-improving, resilient digital environment will need to be guided and governed by institutions that are transparent, expert and have a clear and widely-understood remit.
● Research: A step change in cybersecurity research and practice should be pursued; it will require a new approach to research, focused on identifying ambitious high-level goals and enabling excellent researchers to pursue those ambitions. This would build on the UK's existing strengths in many aspects of cybersecurity research and ultimately help build a resilient and trusted digital sector based on excellent research and world-class expertise.
● Translation: The UK should promote a free and unencumbered flow of cybersecurity ideas from research to practical use and support approaches that have public benefits beyond their short term financial return. The unanticipated nature of future cyber threats means that a diverse set of cybersecurity ideas and approaches will be needed to build resilience and adaptivity. Many of the most valuable ideas will have broad security benefits for the public, beyond any direct financial returns.
Abstract:
A primary goal of context-aware systems is delivering the right information at the right place and time to users, enabling them to make effective decisions and improve their quality of life. There are three key requirements for achieving this goal: determining what information is relevant, personalizing it based on the users' context (location, preferences, behavioral history, etc.), and delivering it to them in a timely manner without an explicit request. These requirements create a paradigm that we term "Proactive Context-aware Computing". Most existing context-aware systems fulfill only a subset of these requirements. Many focus only on personalization of requested information based on users' current context, and they are often designed for specific domains. In addition, most existing systems are reactive: users request some information and the system delivers it. These systems are not proactive, i.e., they cannot anticipate users' intent and behavior and act without an explicit request. To overcome these limitations, we need to conduct a deeper analysis and enhance our understanding of context-aware systems that are generic, universal, proactive and applicable to a wide variety of domains. To support this dissertation, we explore several directions. Clearly, the most significant sources of information about users today are smartphones: a large amount of users' context can be acquired through them, and they are an effective means of delivering information to users. In addition, social media such as Facebook, Flickr and Foursquare provide a rich and powerful platform for mining users' interests, preferences and behavioral history. We employ the ubiquity of smartphones and the wealth of information available from social media to address the challenge of building proactive context-aware systems.
We have implemented and evaluated several approaches, including some as part of the Rover framework, to achieve the paradigm of Proactive Context-aware Computing. Rover is a context-aware research platform that has been evolving for the last 6 years. Since location is one of the most important contexts for users, we have developed 'Locus', an indoor localization, tracking and navigation system for multi-story buildings. Other important dimensions of users' context include the activities they are engaged in. To this end, we have developed 'SenseMe', a system that leverages the smartphone and its multiple sensors to perform multidimensional context and activity recognition for users. As part of the 'SenseMe' project, we also conducted an exploratory study of privacy, trust, risks and other concerns of users with smartphone-based personal sensing systems and applications. To determine what information is relevant to users' situations, we have developed 'TellMe', a system that employs a new, flexible and scalable approach based on Natural Language Processing techniques to perform bootstrapped discovery and ranking of relevant information in context-aware systems. To personalize the relevant information, we have also developed an algorithm and system for mining a broad range of users' preferences from their social network profiles and activities. For recommending new information to users based on their past behavior and context history (such as visited locations, activities and time), we have developed a recommender system and approach for performing multi-dimensional collaborative recommendations using tensor factorization. For timely delivery of personalized and relevant information, it is essential to anticipate and predict users' behavior.
To this end, we have developed a unified infrastructure within the Rover framework and implemented several novel approaches and algorithms that employ various contextual features and state-of-the-art machine learning techniques to build diverse behavioral models of users. Examples of generated models include classifying users' semantic places and mobility states, predicting their availability for accepting calls on smartphones, and inferring their device charging behavior. Finally, to enable proactivity in context-aware systems, we have also developed a planning framework based on HTN (Hierarchical Task Network) planning. Together, these works provide a major push in the direction of proactive context-aware computing.
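The multi-dimensional collaborative recommendation via tensor factorization mentioned above can be sketched as a rank-2 CP decomposition of a small user x place x context tensor, fit by stochastic gradient descent. This is a generic illustration under assumed toy data, not the dissertation's actual algorithm or parameters:

```python
import random

def cp_factorize(tensor, rank=2, steps=3000, lr=0.02, seed=0):
    """Fit a rank-`rank` CP decomposition of a small user x item x context
    tensor of observed scores by stochastic gradient descent on squared
    error. Entries marked None are unobserved and skipped in training."""
    rng = random.Random(seed)
    U, I, C = len(tensor), len(tensor[0]), len(tensor[0][0])
    A = [[rng.uniform(0.1, 1.0) for _ in range(rank)] for _ in range(U)]
    B = [[rng.uniform(0.1, 1.0) for _ in range(rank)] for _ in range(I)]
    D = [[rng.uniform(0.1, 1.0) for _ in range(rank)] for _ in range(C)]

    def predict(u, i, c):
        # CP model: sum over rank of A[u][r] * B[i][r] * D[c][r]
        return sum(A[u][r] * B[i][r] * D[c][r] for r in range(rank))

    for _ in range(steps):                      # one pass over observed entries
        for u in range(U):
            for i in range(I):
                for c in range(C):
                    x = tensor[u][i][c]
                    if x is None:
                        continue
                    e = predict(u, i, c) - x    # signed error on this entry
                    for r in range(rank):
                        ga = e * B[i][r] * D[c][r]
                        gb = e * A[u][r] * D[c][r]
                        gd = e * A[u][r] * B[i][r]
                        A[u][r] -= lr * ga
                        B[i][r] -= lr * gb
                        D[c][r] -= lr * gd
    return predict

# Toy data: 2 users x 2 places x 2 time-of-day contexts; one unobserved entry.
scores = [[[5.0, 3.0], [1.0, None]],
          [[4.0, 3.0], [1.0, 1.0]]]
predict = cp_factorize(scores)
unknown = predict(0, 1, 1)  # estimated score for the unobserved entry
```

The rank, learning rate and step count are arbitrary toy choices; a real recommender would add regularization and held-out validation, but the factor structure is what lets context act as a first-class recommendation dimension.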