837 results for Vehicular ad hoc networks (Computer networks)
Abstract:
As a key element in their response to new media forcing transformations in mass media and media use, newspapers have deployed various strategies not only to establish online and mobile products and develop healthy business plans, but also to set themselves up as dominant portals. Their response to change was the subject of an early investigation by one of the present authors (Keshvani 2000). That was part of a set of short studies inquiring into what impact new software applications and digital convergence might have on journalism practice (Tickle and Keshvani 2000), and also looking for demonstrations of the way that innovations, technologies and protocols then under development might produce a “wireless, streamlined electronic news production process” (Tickle and Keshvani 2001). The newspaper study compared the online products of The Age in Melbourne and the Straits Times in Singapore. It provided an audit of the Singapore and Australia Information and Communications Technology (ICT) climate, concentrating on the state of development of carrier networks as a determining factor in the potential strength of the two services within their respective markets. In the outcome, contrary to initial expectations, the early cable roll-out and extensive ‘wiring’ of the city in Singapore had not produced a level of uptake of Internet services as strong as that achieved in Melbourne by more ad hoc and varied strategies. By interpretation, while news websites and online content were at an early stage of development everywhere, and much the same as one another, no determining structural imbalance existed to separate these leading media participants in Australia and South-east Asia. The present research revisits that situation by again studying the online editions of the two large newspapers in the original study, together with one other, The Courier Mail (included, in recognition of the diversification of product types in this field, as a representative of Newscorp, now a major participant). The inquiry works through the principle of comparison. It is an exercise in qualitative, empirical research that establishes a comparison between the situation in 2000 as described in the earlier work, and the situation in 2014, after a decade of intense development in digital technology affecting the media industries. It is in that sense a follow-up study on the earlier work, although this time giving emphasis to the content and style of the actual products as experienced by their users. It compares the online and print editions of each of these three newspapers; then the three mastheads as print and online entities, among themselves; and finally one against the other two, as representing a South-east Asian model and Australian models. This exercise is accompanied by a review of literature on the developments in ICT affecting media production and media organisations, to establish the changed context. The new study of the online editions is conducted as a systematic appraisal of the first level, or principal screens, of the three publications over the course of six days (10-15.2.14 inclusive). For this, categories for analysis were devised through a preliminary examination of the products over three days in the preceding week.
That process identified significant elements of media production, such as: variegated sourcing of materials; randomness in the presentation of items; differential production values among the media platforms considered, whether text, video or still images; the occasional repurposing and repackaging of the top news stories of the day; and the presence of standard news values, once again drawn out of the trial ‘bundle’ of journalistic items. Reduced in this way, the online artefacts become comparable with the companion print editions from the same days. The categories devised and then used in the appraisal of the online products have been adapted to print, to give the closest match of sets of variables. This device, studying the two sets of publications on like standards (essentially production values and news values), has enabled the comparisons to be made. This comparison of the online and print editions of each of the three publications was set up as the first step in the investigation. In recognition of the nature of the artefacts, as ones that carry very diverse information by subject and level of depth, and involve heavy creative investment in the formulation and presentation of the information, the assessment also includes an open section for interpreting and commenting on main points of comparison. This takes the form of a field for text, for the insertion of notes, in the table employed for summarising the features of each product, for each day. When the sets of comparisons as outlined above are noted, the process then becomes interpretative, guided by the notion of change. In the context of changing media technology and publication processes, what substantive alterations have taken place in the overall effort of news organisations in the print and online fields since 2001, and in their print and online products separately? Have they diverged or continued along similar lines? The remaining task is to begin to make inferences from that. Will the examination of findings support the proposition that a review of the earlier study, and a forensic review of new models, provides evidence of the character and content of change, especially change in journalistic products and practice? Will it permit an authoritative description of the essentials of such change in products and practice? Will it permit generalisation, and provide a reliable base for discussion of the implications of change, and future prospects? Preliminary observations suggest a more dynamic and diversified product has been developed in Singapore: well themed, and obviously sustained by public commitment and habituation to diversified online and mobile media services. The Australian products suggest a concentrated corporate and journalistic effort and deployment of resources, with a strong market focus, but less settled and ordered, and showing signs of limitations imposed by the delay in establishing a uniform, large broadband network. The scope of the study is limited. It is intended to test, and take advantage of, the original study as evidentiary material from the early days of newspaper companies’ experimentation with online formats. Both are small studies. The key opportunity for discovery lies in the ‘time capsule’ factor: the availability of well-gathered and processed information on major newspaper company production, at the threshold of a transformational decade of change in their industry. The comparison stands to identify key changes.
It should also be useful as a reference for further inquiries of the same kind, and for ongoing monitoring of the situation regarding online newspaper portals.
Abstract:
Determining the key variables of transportation disadvantage remains a great challenge, as the variables are commonly selected using ad hoc techniques. In order to identify the variables, this research develops a transportation disadvantage framework by adapting the capability approach. The developed framework is statistically analysed using partial least squares-based software to determine the framework’s fitness. The statistical analysis identifies mobility and socioeconomic variables that significantly influence transportation disadvantage. For the case of Brisbane, Australia, the research reveals the key socioeconomic variables for transportation disadvantage to be household structure, the presence of a dependent family member, vehicle ownership, and driving licence possession.
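As an illustration only, here is a minimal sketch of the kind of partial least squares (PLS) analysis the abstract refers to, written in Python with scikit-learn. The sample size, variable names and data below are hypothetical stand-ins, not the study’s instrument or results.

```python
# Hypothetical PLS sketch: relating socioeconomic indicators to a
# transportation-disadvantage score. All data are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n = 200  # assumed number of survey respondents

# Hypothetical predictors: household structure, dependent family members,
# vehicle ownership, driving licence possession (standardised).
X = rng.normal(size=(n, 4))
# Hypothetical response: a latent transportation-disadvantage score.
y = X @ np.array([0.5, 0.8, -0.6, -0.4]) + rng.normal(scale=0.5, size=n)

pls = PLSRegression(n_components=2)
pls.fit(X, y)
print("R^2 on training data:", pls.score(X, y))
print("Predictor loadings:\n", pls.x_loadings_)
```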
Abstract:
This paper describes the development and experimental evaluation of a novel vision-based Autonomous Surface Vehicle designed to perform coordinated docking manoeuvres with a target, such as an Autonomous Underwater Vehicle, on the water’s surface. The system architecture integrates two small processor units: the first performs vehicle control and implements a virtual-force obstacle avoidance and docking strategy, while the second performs vision-based target segmentation and tracking. Furthermore, the architecture utilises wireless sensor network technology, allowing the vehicle to be observed by, and even integrated within, an ad-hoc sensor network. The system performance is demonstrated through real-world experiments.
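The virtual-force strategy mentioned here follows the familiar potential-field pattern: an attractive force toward the docking target and repulsive forces from nearby obstacles. The sketch below illustrates that idea only; the gains, radii and geometry are assumptions, not the paper’s controller.

```python
# Minimal virtual-force (potential-field) guidance sketch: the vehicle
# is pulled toward the docking target and pushed away from obstacles
# within a sensing radius. All constants are illustrative assumptions.
import numpy as np

K_ATT = 1.0     # attractive gain (assumed)
K_REP = 0.5     # repulsive gain (assumed)
R_SENSE = 5.0   # obstacle influence radius in metres (assumed)

def virtual_force(pos, target, obstacles):
    """Return the net 2-D steering force on the vehicle."""
    force = K_ATT * (target - pos)  # pull toward the target
    for obs in obstacles:
        offset = pos - obs
        dist = np.linalg.norm(offset)
        if 0.0 < dist < R_SENSE:
            # Repulsion grows sharply as the obstacle gets closer.
            force += K_REP * (1.0 / dist - 1.0 / R_SENSE) * offset / dist**2
    return force

pos = np.array([0.0, 0.0])
target = np.array([10.0, 4.0])
obstacles = [np.array([5.0, 2.0])]
print(virtual_force(pos, target, obstacles))
```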
Abstract:
The Kyoto Protocol is remarkable among global multilateral environmental agreements for its efforts to depoliticize compliance. However, attempts to create autonomous, arm’s length and rule-based compliance processes with extensive reliance on putatively neutral experts were only partially realized in practice in the first commitment period from 2008 to 2012. In particular, the procedurally constrained facilitative powers vested in the Facilitative Branch were circumvented, and expert review teams (ERTs) assumed pivotal roles in compliance facilitation. The ad hoc diplomatic and facilitative practices engaged in by these small teams of technical experts raise questions about the reliability and consistency of the compliance process. For the future operation of the Kyoto compliance system, it is suggested that ERTs should be confined to more technical and procedural roles, in line with their expertise. There would then be greater scope for the Facilitative Branch to assume a more comprehensive facilitative role, safeguarded by due process guarantees, in accordance with its mandate. However, if – as appears likely – the future compliance trajectories under the United Nations Framework Convention on Climate Change will include a significant role for ERTs without oversight by the Compliance Committee, it is important to develop appropriate procedural safeguards that reflect and shape the various technical and political roles these teams currently play.
Abstract:
The requirement of isolated relays is one of the prime obstacles to utilizing sequential slotted cooperative protocols in Vehicular Ad-hoc Networks (VANETs). Significant research advances have improved the diversity-multiplexing trade-off (DMT) of cooperative protocols in conventional mobile networks, without much attention to vehicular ad-hoc networks. We have extended the concept of sequential slotted amplify-and-forward (SAF) protocols to urban vehicular ad-hoc networks. Multiple Input Multiple Output (MIMO) reception is used at relaying vehicular nodes to isolate the relays effectively. The proposed approach adds pragmatic value to sequential slotted cooperative protocols while achieving attractive performance gains in urban VANETs. We have analysed the DMT bounds and the outage probabilities of the proposed scheme. The results suggest that the proposed scheme can achieve an optimal DMT similar to the DMT upper bound of the sequential SAF. Furthermore, the proposed scheme outperforms the SAF protocol by 2.5 dB in outage performance at a target outage probability of 10⁻⁴.
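The paper’s sequential slotted SAF scheme with MIMO relay isolation is not reproduced here. As a much simpler stand-in, the sketch below estimates by Monte Carlo the outage probability of a generic two-slot amplify-and-forward relay link over Rayleigh fading; the SNR and target rate are assumed values.

```python
# Monte Carlo outage estimate for a single variable-gain
# amplify-and-forward (AF) relay hop over Rayleigh fading.
import numpy as np

rng = np.random.default_rng(1)
N = 10**6       # fading realisations
snr_db = 20.0   # average per-hop SNR (assumed)
rate = 1.0      # target rate in bits/s/Hz (assumed)

snr = 10 ** (snr_db / 10)
g1 = snr * rng.exponential(size=N)  # source->relay instantaneous SNR
g2 = snr * rng.exponential(size=N)  # relay->destination instantaneous SNR

# Exact end-to-end SNR of a variable-gain AF relay.
g_af = g1 * g2 / (g1 + g2 + 1)

# Two time slots are used, hence the 1/2 spectral-efficiency factor.
outage = np.mean(0.5 * np.log2(1 + g_af) < rate)
print(f"Estimated outage probability: {outage:.2e}")
```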
Abstract:
The notion of being sure that you have completely eradicated an invasive species is fanciful because of imperfect detection and persistent seed banks. Eradication is commonly declared either on an ad hoc basis, on notions of seed bank longevity, or on setting arbitrary thresholds of 1% or 5% confidence that the species is not present. Rather than declaring eradication at some arbitrary level of confidence, we take an economic approach in which we stop looking when the expected costs outweigh the expected benefits. We develop theory that determines the number of years of absent surveys required to minimize the net expected cost. Given that detection of a species is imperfect, the optimal stopping time is a trade-off between the cost of continued surveying and the cost of escape and damage if eradication is declared too soon. A simple rule of thumb compares well to the exact optimal solution obtained by stochastic dynamic programming. Application of the approach to the eradication programme of Helenium amarum reveals that the actual stopping time was a precautionary one, given the ranges for each parameter.
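The stopping rule described above lends itself to a short worked illustration: survey until the expected cost of another survey year exceeds the expected damage averted by it. Every parameter value below is an assumption for illustration, not a figure from the Helenium amarum programme.

```python
# Choose the number of absent-survey years that minimizes total
# expected cost = survey costs + (probability still present) * damage.
import numpy as np

P_DETECT = 0.6        # annual detection probability if present (assumed)
PRIOR = 0.5           # prior probability the species persists (assumed)
SURVEY_COST = 10.0    # cost of one survey year, arbitrary units (assumed)
DAMAGE_COST = 1000.0  # expected damage if eradication is declared too soon

def prob_present(n_absent):
    """Posterior probability of presence after n consecutive absent surveys."""
    miss = (1 - P_DETECT) ** n_absent
    return PRIOR * miss / (PRIOR * miss + (1 - PRIOR))

costs = [n * SURVEY_COST + prob_present(n) * DAMAGE_COST for n in range(30)]
best_n = int(np.argmin(costs))
print(f"Stop after {best_n} absent survey years "
      f"(expected cost {costs[best_n]:.1f})")
```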
Abstract:
Most standard algorithms for prediction with expert advice depend on a parameter called the learning rate. This learning rate needs to be large enough to fit the data well, but small enough to prevent overfitting. For the exponential weights algorithm, a sequence of prior work has established theoretical guarantees for higher and higher data-dependent tunings of the learning rate, which allow for increasingly aggressive learning. But in practice such theoretical tunings often still perform worse (as measured by their regret) than ad hoc tuning with an even higher learning rate. To close the gap between theory and practice, we introduce an approach to learn the learning rate. Up to a factor that is at most (poly)logarithmic in the number of experts and the inverse of the learning rate, our method performs as well as if we knew the empirically best learning rate from a large range that includes both conservative small values and values much higher than those for which formal guarantees were previously available. Our method employs a grid of learning rates, yet runs in linear time regardless of the size of the grid.
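For context, here is a minimal sketch of the exponential weights (Hedge) forecaster whose learning rate is at issue, evaluated over a small grid of rates on synthetic losses. It shows only the base algorithm and a naive grid; the paper’s linear-time method for learning the rate is not reproduced.

```python
# Exponential weights (Hedge) with a fixed learning rate eta.
import numpy as np

def exp_weights(losses, eta):
    """Run Hedge with learning rate eta on a (T, K) loss matrix.

    Returns the forecaster's cumulative expected loss.
    """
    T, K = losses.shape
    log_w = np.zeros(K)          # log-weights avoid numerical underflow
    total = 0.0
    for t in range(T):
        p = np.exp(log_w - log_w.max())
        p /= p.sum()             # current distribution over experts
        total += p @ losses[t]   # expected loss this round
        log_w -= eta * losses[t] # exponential weights update
    return total

rng = np.random.default_rng(2)
losses = rng.random((1000, 10))  # synthetic losses in [0, 1]

# Naive grid of learning rates, echoing the idea of selecting the
# empirically best eta from a wide range.
for eta in [0.01, 0.1, 1.0, 10.0]:
    print(f"eta={eta:5.2f}: cumulative loss {exp_weights(losses, eta):.1f}")
```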
Abstract:
In some delay-tolerant communication systems, such as vehicular ad-hoc networks, information flow can be represented as an infectious process, where each entity that has already received the information will try to share it with its neighbours. The random walk and random waypoint models are popular analysis tools for these epidemic broadcasts, and represent two types of random mobility. In this paper, we introduce a simulation framework investigating the impact of a gradual increase of bias in path selection (i.e. a reduction of randomness) when moving from the former to the latter. Randomness in path selection can significantly alter system performance, in both regular and irregular network structures. The implications of these results for real systems are discussed in detail.
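A minimal sketch of the kind of interpolation such a framework might use: a bias parameter that slides a node from a pure random walk (bias = 0) toward waypoint-directed travel (bias = 1). The step rule, area size, and coverage metric below are illustrative assumptions, not the paper’s simulator.

```python
# Interpolating between random-walk and random-waypoint mobility
# with a single bias parameter. All settings are illustrative.
import numpy as np

rng = np.random.default_rng(3)
SIZE = 100.0  # side of the square simulation area (assumed)

def simulate(bias, steps=1000):
    pos = rng.uniform(0, SIZE, size=2)
    waypoint = rng.uniform(0, SIZE, size=2)
    track = [pos.copy()]
    for _ in range(steps):
        if np.linalg.norm(waypoint - pos) < 1.0:
            waypoint = rng.uniform(0, SIZE, size=2)  # pick a new waypoint
        if rng.random() < bias:
            step = waypoint - pos
            step /= np.linalg.norm(step)             # head for the waypoint
        else:
            angle = rng.uniform(0, 2 * np.pi)        # random direction
            step = np.array([np.cos(angle), np.sin(angle)])
        pos = np.clip(pos + step, 0, SIZE)
        track.append(pos.copy())
    return np.array(track)

# Compare coverage (distinct visited unit cells) across bias settings.
for bias in [0.0, 0.5, 1.0]:
    track = simulate(bias)
    cells = {tuple(c) for c in np.floor(track).astype(int)}
    print(f"bias={bias}: visited {len(cells)} distinct unit cells")
```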
Abstract:
It’s commonly assumed that psychiatric violence is motivated by delusions, but here the concept of a reversed impetus is explored, to understand whether delusions are formed as ad-hoc or post-hoc rationalizations of behaviour, or in advance of the actus reus. The reflexive violence model proposes that perceptual stimuli have motivational power, which may trigger unwanted actions and hallucinations. The model is based on the theory of ecological perception, where the opportunities enabled by an object are cues to act. As an apple triggers a desire to eat, a gun triggers a desire to shoot. These affordances (as they are called) are part of the perceptual apparatus; they allow the direct recognition of objects, and in emergencies they enable the fastest possible reactions. Even under normal circumstances, the presence of a weapon will trigger inhibited violent impulses. The presence of a victim will do so as well, but under normal circumstances these affordances don’t become violent because negative action impulses are totally inhibited, whereas in psychotic illness negative action impulses are treated as emergencies and bypass frontal inhibitory circuits. What would have been object recognition becomes a blind automatic action. A range of mental illnesses can cause inhibition to be bypassed. At its most innocuous, this causes simple hallucinations (where the motivational power of an object is misattributed). But ecological perception may also have the power to trigger serious violence: a kind that is devoid of motives or planning and is often shrouded in amnesia or post-rational delusions.
Abstract:
The proliferation of the web presents an unsolved problem of automatically analyzing billions of pages of natural language. We introduce a scalable algorithm that clusters hundreds of millions of web pages into hundreds of thousands of clusters. It does this on a single mid-range machine using efficient algorithms and compressed document representations. It is applied to two web-scale crawls covering tens of terabytes: ClueWeb09 and ClueWeb12 contain 500 and 733 million web pages respectively, and were clustered into 500,000 to 700,000 clusters. To the best of our knowledge, such fine-grained clustering has not been previously demonstrated. Previous approaches clustered a sample, which limits the maximum number of discoverable clusters. The proposed EM-tree algorithm uses the entire collection in clustering and produces several orders of magnitude more clusters than the existing algorithms. Fine-grained clustering is necessary for meaningful clustering in massive collections, where the number of distinct topics grows linearly with collection size. These fine-grained clusters show improved cluster quality when assessed with two novel evaluations, using ad hoc search relevance judgments and spam classifications for external validation. These evaluations solve the problem of assessing the quality of clusters where categorical labeling is unavailable or infeasible.
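The EM-tree algorithm itself is not sketched here. As a toy stand-in for the general approach (clustering compact, dictionary-free document representations at scale), the example below hashes documents into fixed-size vectors and clusters them with mini-batch k-means; the corpus, dimensionality, and cluster count are deliberately tiny.

```python
# Stand-in for web-scale clustering: hashed (compressed) document
# vectors plus mini-batch k-means. Not the EM-tree algorithm.
from collections import Counter

from sklearn.cluster import MiniBatchKMeans
from sklearn.feature_extraction.text import HashingVectorizer

docs = [
    "web scale document clustering",
    "clustering billions of pages",
    "spam classification for the web",
    "ad hoc search relevance judgments",
] * 250  # toy corpus; real collections hold hundreds of millions of pages

# Hashing gives a fixed-size, dictionary-free document representation.
X = HashingVectorizer(n_features=2**12, alternate_sign=False).fit_transform(docs)

km = MiniBatchKMeans(n_clusters=4, batch_size=256, n_init=3, random_state=0)
labels = km.fit_predict(X)
print("cluster sizes:", sorted(Counter(labels).values()))
```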
Abstract:
‘Complexity’ is a term that is increasingly prevalent in conversations about building capacity for 21st Century professional engineers. Society is grappling with the urgent and challenging reality of accommodating seven billion people, meeting needs and innovating lifestyle improvements in ways that do not destroy the atmospheric, biological and oceanic systems critical to life. Over the last two decades in particular, engineering educators have been active in attempting to build capacity amongst professionals to deliver ‘sustainable development’ in this rapidly changing global context. However, the curriculum literature clearly points to a lack of significant progress, with efforts best described as ad hoc and highly varied. Given the limited timeframes for action to curb environmental degradation proposed by scientists and intergovernmental agencies, the authors of this paper argue that it is imperative that curriculum renewal towards education for sustainable development proceeds rapidly, systemically, and in a transformational manner. Within this context, the paper discusses the need to consider a multiple-track approach to building capacity for 21st Century engineering, including priorities and timeframes for undergraduate and postgraduate curriculum renewal. The paper begins with a contextual discussion of the term complexity and how it relates to life in the 21st Century. The authors then present a whole-of-system approach for planning and implementing rapid curriculum renewal that addresses the critical roles of several generations of engineering professionals over the next three decades. The paper concludes with observations regarding engaging with this approach in the context of emerging accreditation requirements and existing curriculum renewal frameworks.
Abstract:
Background: Foot dorsiflexion plays an essential role in both controlling balance and human gait. Electromyography (EMG) and sonomyography (SMG) can provide information on several aspects of muscle function. The aim was to establish the relationship between EMG and SMG variables during isotonic contractions of the foot dorsiflexors. Methods: Twenty-seven healthy young adults performed the foot dorsiflexion test on a device designed ad hoc. The EMG variables were the maximum peak and the area under the curve. The muscle architecture variables were muscle thickness and pennation angle. Descriptive statistical analysis, inferential analysis and a multivariate linear regression model were carried out, with statistical significance set at p < 0.05. Results: The correlation between EMG and SMG variables was r = 0.462 (p < 0.05). The linear regression model for the dependent variable “normalized peak of tibialis anterior (TA)”, with “pennation angle” and “thickness” as independent variables, was significant (p = 0.002), with an explained variance of R² = 0.693 and SEE = 0.16. Conclusions: There is a significant relationship, and a quantifiable degree of contribution, between EMG and SMG variables during isotonic contractions of the TA muscle. Our results suggest that EMG and SMG can be feasible tools for the monitoring and assessment of the foot dorsiflexors. Parameterization and assessment of the TA muscle are relevant because increased strength accelerates the recovery of lower limb injuries.
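A minimal sketch of the kind of multivariate linear regression reported (normalized EMG peak predicted from pennation angle and muscle thickness), using synthetic placeholder data rather than the study’s measurements; statsmodels is assumed for the OLS fit.

```python
# OLS regression sketch: predict normalized EMG peak from pennation
# angle and muscle thickness. All data below are synthetic placeholders.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 27  # matches the number of participants in the study

pennation_angle = rng.normal(12.0, 2.0, size=n)  # degrees (synthetic)
thickness = rng.normal(2.5, 0.3, size=n)         # cm (synthetic)
peak_emg = (0.04 * pennation_angle + 0.2 * thickness
            + rng.normal(0, 0.1, size=n))        # synthetic response

X = sm.add_constant(np.column_stack([pennation_angle, thickness]))
model = sm.OLS(peak_emg, X).fit()
print(model.summary())  # reports R^2, coefficients, p-values, SEE
```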
Abstract:
For the first decade of its existence, the concept of citizen journalism has described an approach which was seen as a broadening of the participant base in journalistic processes, but still involved only a comparatively small subset of overall society – for the most part, citizen journalists were news enthusiasts and “political junkies” (Coleman, 2006) who, as some exasperated professional journalists put it, “wouldn’t get a job at a real newspaper” (The Australian, 2007), but nonetheless followed many of the same journalistic principles. The investment – if not of money, then at least of time and effort – involved in setting up a blog or participating in a citizen journalism Website remained substantial enough to prevent the majority of Internet users from engaging in citizen journalist activities to any significant extent; what emerged in the form of news blogs and citizen journalism sites was a new online elite which for some time challenged the hegemony of the existing journalistic elite, but gradually also merged with it. The mass adoption of next-generation social media platforms such as Facebook and Twitter, however, has led to the emergence of a new wave of quasi-journalistic user activities which now much more closely resemble the “random acts of journalism” which JD Lasica envisaged in 2003. Social media are not exclusively or even predominantly used for citizen journalism; instead, citizen journalism is now simply a by-product of user communities engaging in exchanges about the topics which interest them, or tracking emerging stories and events as they happen. Such platforms – and especially Twitter with its system of ad hoc hashtags that enable the rapid exchange of information about issues of interest – provide spaces for users to come together to “work the story” through a process of collaborative gatewatching (Bruns, 2005), content curation, and information evaluation which takes place in real time and brings together everyday users, domain experts, journalists, and potentially even the subjects of the story themselves. Compared to the spaces of news blogs and citizen journalism sites, but also of conventional online news Websites, which are controlled by their respective operators and inherently position user engagement as a secondary activity to content publication, these social media spaces are centred around user interaction, providing a third-party space in which everyday as well as institutional users, laypeople as well as experts converge without being able to control the exchange. Drawing on a number of recent examples, this article will argue that this results in a new dynamic of interaction and enables the emergence of a more broadly-based, decentralised, second wave of citizen engagement in journalistic processes.
Abstract:
The cognitive benefits of biophilia have been studied quite extensively, dating as far back as the 1980s, while studies into its economic benefits are still in their infancy. Recent research has attempted to quantify a number of economic returns on biophilic elements; however, knowledge in this field is still ad hoc and highly variable. Many studies acknowledge difficulties in discerning certain benefits, such as social and aesthetic ones. While the physiological and psychological effects of exposure to nature are widely recognised and conceptually understood, they have not yet been systematically translated into monetary terms. It is clear from the literature that further research is needed both to obtain data on the economics of biophilic urbanism and to create the business case for biophilic urbanism. With this in mind, this paper will briefly highlight biophilic urbanism, referencing previous work in the field. It will then explore a number of emergent gaps in the measurable economic understanding of these elements and suggest opportunities for engaging decision makers in the business case for biophilic urbanism. The paper concludes with recommendations for moving forward through targeted research and economic analysis.
Abstract:
In 2007 the National Framework for Energy Efficiency provided funding for the first survey of energy efficiency education across all Australian universities teaching engineering education. The survey asked the question, ‘What is the state of education for energy efficiency in Australian engineering education?’. There was an excellent response to the survey, with 48 course responses from lecturers across 27 universities from every state and territory in Australia, and 260 student responses from 18 courses across 8 universities from all 6 states. It is concluded from the survey findings that the state of education for energy efficiency in Australian engineering education is currently highly variable and ad hoc across universities and engineering disciplines.