962 results for internet computing
Abstract:
The project is a qualitative and ethnographic study investigating how the internet is used by migrants from culturally and linguistically diverse (CALD) groups. It examines how internet use assists the re-settlement process in Brisbane, Australia. The project aims to support strategies and initiatives for the successful re-settlement of migrants from CALD groups into urban localities. It has two main foci:
• To find out how and what type of online information and services are used by migrants, and what the barriers to accessing that information are. Information and resources about transport, housing, health, social services, education and other matters are essential for successful re-settlement.
• To find out in what ways the internet is used as a social medium, to communicate with friends and family in homeland countries and to connect with people in local regions, in ways that might help to combat social isolation.
Abstract:
This research introduces a general methodology for creating a Coloured Petri Net (CPN) model of a security protocol. Standard or user-defined security properties of the created CPN model are then identified. After adding an attacker model to the protocol model, the security property is verified using the state space method. The approach is applied to analyse a number of trusted computing protocols, and the results show the applicability of the proposed method to analysing both standard and user-defined properties.
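The abstract does not reproduce the CPN models themselves (these are typically built in dedicated tools such as CPN Tools), but the state space method it relies on is at heart an exhaustive reachability check over protocol states. A minimal sketch in Python, where the state encoding, the attacker move, and the toy secrecy property are all invented for illustration:

```python
from collections import deque

def explore_state_space(initial_state, successors, property_holds):
    """Breadth-first exploration of a protocol's reachable states.

    `successors(state)` yields states reachable in one transition
    (including attacker moves); `property_holds(state)` encodes the
    security property. Returns a violating state, or None.
    """
    seen = {initial_state}
    queue = deque([initial_state])
    while queue:
        state = queue.popleft()
        if not property_holds(state):
            return state  # counterexample: property violated here
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None  # property holds in every reachable state

# Toy secrecy check: the attacker must never learn the token "secret".
# States are (attacker_knowledge, channel) pairs of frozensets.
def successors(state):
    knowledge, channel = state
    for msg in channel:  # the attacker can read any message on the wire
        yield (knowledge | {msg}, channel)

initial = (frozenset(), frozenset({"hello", "secret"}))
violation = explore_state_space(initial, successors,
                                lambda s: "secret" not in s[0])
print("counterexample:", violation)
```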
Abstract:
New technologies, in particular the Internet, have transformed journalistic practices in many ways around the world. While a number of studies have investigated how established journalists deal with and use new technologies in various countries, very little attention has been paid to how student journalists view and use the Internet as a source of news. This study examined the ways in which second- and third-year journalism and arts students at the University of Queensland (Australia) get their news, how they use the Internet as a news channel, and their perceptions and use of other new technologies. The authors draw on the theoretical frameworks of uses and gratifications and media richness theory to explore the primary reasons why students use and perceive the Internet as a news channel.
Jacobian-free Newton-Krylov methods with GPU acceleration for computing nonlinear ship wave patterns
Abstract:
The nonlinear problem of steady free-surface flow past a submerged source is considered as a case study for three-dimensional ship wave problems. Of particular interest is the distinctive wedge-shaped wave pattern that forms on the surface of the fluid. By reformulating the governing equations with a standard boundary-integral method, we derive a system of nonlinear algebraic equations that enforce a singular integro-differential equation at each midpoint on a two-dimensional mesh. Our contribution is to solve the system of equations with a Jacobian-free Newton-Krylov method together with a banded preconditioner that is carefully constructed with entries taken from the Jacobian of the linearised problem. Further, we are able to utilise graphics processing unit acceleration to significantly increase the grid refinement and decrease the run-time of our solutions in comparison to schemes that are presently employed in the literature. Our approach provides opportunities to explore the nonlinear features of three-dimensional ship wave patterns, such as the shape of steep waves close to their limiting configuration, in a manner that has been possible in the two-dimensional analogue for some time.
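The paper's boundary-integral system is not given in the abstract, so the sketch below substitutes a simple one-dimensional nonlinear problem; it only illustrates the Jacobian-free Newton-Krylov strategy with a banded preconditioner built from the Jacobian of the linearised problem, using SciPy's `newton_krylov` (the GPU acceleration is not reproduced). All symbols and the model equation are illustrative assumptions, not the authors' formulation:

```python
import numpy as np
from scipy.optimize import newton_krylov
from scipy.sparse import diags
from scipy.sparse.linalg import splu, LinearOperator

n = 200
h = 1.0 / (n + 1)

def residual(u):
    # Discrete -u'' + exp(u) = 2 with zero Dirichlet boundaries: a simple
    # nonlinear stand-in for the singular integro-differential system.
    upad = np.concatenate(([0.0], u, [0.0]))
    lap = (2.0 * upad[1:-1] - upad[:-2] - upad[2:]) / h**2
    return lap + np.exp(u) - 2.0

# Banded preconditioner from the Jacobian of the *linearised* problem
# (exp(u) ~ 1 + u near u = 0), mirroring the strategy in the abstract.
J_lin = diags([-1.0 / h**2, 2.0 / h**2 + 1.0, -1.0 / h**2],
              offsets=[-1, 0, 1], shape=(n, n), format="csc")
M = splu(J_lin)
precond = LinearOperator((n, n), matvec=M.solve)

# Newton-Krylov never forms the true Jacobian: matrix-vector products are
# approximated by finite differences of `residual` inside the solver.
u = newton_krylov(residual, np.zeros(n), inner_M=precond)
print("max |residual|:", np.max(np.abs(residual(u))))
```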
Abstract:
This PhD research investigates the critical resources and Internet capabilities utilized by entrepreneurial firms to leverage global performance. Firm resources have been identified as important assets that contribute to a firm's competitive global position. The Internet is a critical resource for a new generation of small and medium-sized enterprises (SMEs) pursuing international opportunities. By facilitating international business, the Internet can increase the quality and speed of communications, lower transaction costs, and facilitate the development of networks. Despite the increasing number of firms utilizing the Internet to pursue international opportunities, research remains limited. Adopting a multiple case study methodology and structural equation modelling, the research identifies the firm-level resources which, together with capabilities, form a model predicting how international performance in firms is achieved.
Abstract:
For clinical use, in electrocardiogram (ECG) signal analysis it is important to detect not only the centre of the P wave, the QRS complex and the T wave, but also the time intervals, such as the ST segment. Much research has focused entirely on QRS complex detection, via methods such as wavelet transforms, spline fitting and neural networks. However, drawbacks include the false classification of a severe noise spike as a QRS complex, possibly requiring manual editing, or the omission of information contained in other regions of the ECG signal. While some attempts have been made to develop algorithms that detect additional signal characteristics, such as P and T waves, the reported success rates vary from person to person and beat to beat. To address this variability we propose the use of Markov-chain Monte Carlo statistical modelling to extract the key features of an ECG signal, and we report on a feasibility study investigating the utility of the approach. The modelling approach is examined with reference to a realistic computer-generated ECG signal in which details such as wave morphology and noise levels are variable.
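As a rough illustration of the kind of statistical modelling proposed, the sketch below fits a single Gaussian wave (standing in for, say, a P wave) to a noisy synthetic trace with a random-walk Metropolis sampler. The synthetic signal, flat priors and proposal scales are all invented for illustration and are not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic fragment: one Gaussian wave whose centre, amplitude and
# width are the "key features" to recover from noisy data.
t = np.linspace(0.0, 1.0, 200)
true_centre, true_amp, true_width = 0.30, 1.0, 0.04
signal = true_amp * np.exp(-0.5 * ((t - true_centre) / true_width) ** 2)
data = signal + 0.1 * rng.standard_normal(t.size)

def log_posterior(theta):
    centre, amp, width = theta
    if not (0.0 < centre < 1.0 and amp > 0.0 and width > 0.0):
        return -np.inf  # flat prior restricted to a plausible range
    model = amp * np.exp(-0.5 * ((t - centre) / width) ** 2)
    return -0.5 * np.sum((data - model) ** 2) / 0.1**2

# Random-walk Metropolis: accept a proposed move with probability
# min(1, posterior ratio); otherwise keep the current parameters.
theta = np.array([0.5, 0.5, 0.1])
logp = log_posterior(theta)
samples = []
for _ in range(20000):
    proposal = theta + rng.normal(scale=[0.01, 0.02, 0.005])
    logp_new = log_posterior(proposal)
    if np.log(rng.random()) < logp_new - logp:
        theta, logp = proposal, logp_new
    samples.append(theta)

burned = np.array(samples[5000:])  # discard burn-in
print("posterior mean (centre, amp, width):", burned.mean(axis=0))
```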
Abstract:
A modular, graphic-oriented Internet browser has been developed to enable non-technical client access to a literal spinning world of information and remotely sensed imagery. The Earth Portal (www.earthportal.net) uses the ManyOne browser (www.manyone.net) to provide engaging point-and-click views of the Earth, fully tessellated with remotely sensed imagery and geospatial data. The ManyOne browser technology uses Mozilla with embedded plugins to apply multiple 3-D graphics engines, e.g. ArcGlobe or GeoFusion, that directly link with the open-systems architecture of the geospatial infrastructure. This innovation allows satellite imagery to be rendered directly over the Earth's surface and requires no technical training by the web user. Effective use of this global distribution system for the remote sensing community requires minimal compliance with the protocols and standards promoted by the NSDI and other open-systems standards organizations.
Abstract:
The Remote Sensing Core Curriculum (RSCC) was initiated in 1993 to meet the demand for a college-level set of resources to enhance the quality of education across national and international campuses. The American Society of Photogrammetry and Remote Sensing adopted the RSCC in 1996 to sustain support of this educational initiative for its membership and collegiate community. A series of volumes, containing lectures, exercises, and data, is being created by expert contributors to address the different technical fields of remote sensing. The RSCC program is designed to operate on the Internet, taking full advantage of World Wide Web (WWW) technology for distance learning. The issues of curriculum development related to the educational setting, with demands on faculty, students, and facilities, are considered in order to understand the new paradigms for WWW-influenced computer-aided learning. The WWW is shown to be especially appropriate for facilitating remote sensing education, with its requirements for addressing image data sets and multimedia learning tools. The RSCC is located at http://www.umbc.edu/rscc.
Abstract:
Enterprises, both public and private, have rapidly begun to exploit the benefits of enterprise resource planning (ERP) combined with business analytics and "open data sets", which are often outside the control of the enterprise, to gain further efficiencies, build new service operations and increase business activity. In many cases, these business activities are based around relevant software systems hosted in a "cloud computing" environment. "Garbage in, garbage out", or "GIGO", is a term long used to describe problems of unqualified dependency on information systems, dating from the 1960s. However, a more pertinent variation arose sometime later, namely "garbage in, gospel out", signifying that with large-scale information systems, such as ERP and the use of open data sets in a cloud environment, verifying the authenticity of the data sets used may be almost impossible, resulting in dependence upon questionable results. Illicit data set "impersonation" becomes a reality. At the same time, the ability to audit such results may be an important requirement, particularly in the public sector. This paper discusses the need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment, and analyses some current technologies that may be appropriate. However, severe limitations in addressing these requirements have been identified, and the paper proposes further research work in the area.
Abstract:
This special issue of Networking Science focuses on the Next Generation Network (NGN), which enables the deployment of access-independent services over converged fixed and mobile networks. NGN is a packet-based network and uses the Internet Protocol (IP) to transport the various types of traffic (voice, video, data and signalling). NGN facilitates easy adoption of distributed computing applications by providing high-speed connectivity in a converged networked environment. It also makes end-user devices and applications highly intelligent and efficient by empowering them with programmability and remote configuration options. However, there are a number of important challenges in provisioning next generation network technologies in a converged communication environment. Preliminary challenges include those relating to QoS, switching and routing, management and control, and security, which must be addressed as a matter of urgency. The consideration of architectural issues in the design and provision of secure services for NGN deserves special attention and hence is the main theme of this special issue.
Abstract:
Phishing, a form of online identity theft, is a major problem worldwide, accounting for more than $7.5 billion in losses in the US alone between 2005 and 2008. Australia was the first country to be targeted by Internet bank phishing in 2003 and continues to have a significant problem in this area. The major cyber crime groups responsible for phishing are based in Eastern Europe. They operate with a large degree of freedom due to the inherent difficulties of cross-border law enforcement and the current situation in Eastern Europe, particularly in Russia and Ukraine. They employ highly sophisticated and efficient technical tools to compromise victims and subvert bank authentication systems. However, because it is difficult for them to repatriate the fraudulently obtained funds directly, they employ Internet money mules in Australia to transfer the money via Western Union or MoneyGram. It is proposed that a strategy which firstly places more focus by Australian law enforcement upon transactions via Western Union and MoneyGram to detect this money laundering would significantly impact the success of the phishing attack model. This, combined with technical monitoring of Trojan technology and education of potential Internet money mules to avoid being duped, would provide a winning strategy in the war on phishing for Australia.
Abstract:
Recently, the botnet, a network of compromised computers, has been recognized as the biggest threat to the Internet. The bots in a botnet communicate with the botnet owner via a communication channel called the Command and Control (C&C) channel. There are three main C&C channels: Internet Relay Chat (IRC), Peer-to-Peer (P2P) and web-based protocols. By exploiting the flexibility of Web 2.0 technology, the web-based botnet has reached a new level of sophistication. In August 2009, such a botnet was found on Twitter, one of the most popular Web 2.0 services. In this paper, we describe a new type of botnet that uses a Web 2.0 service as a C&C channel and as temporary storage for its stolen information. We then propose a novel approach to thwart this type of attack. Our method applies a unique identifier of the computer, an encryption algorithm with session keys, and CAPTCHA verification.
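The abstract gives only the ingredients of the countermeasure (a machine identifier, session-key encryption, CAPTCHA verification), not how they are combined. A minimal sketch of the first two ingredients, assuming the third-party `cryptography` package, with every detail invented for illustration rather than taken from the paper:

```python
import base64
import hashlib
import os
import uuid
from cryptography.fernet import Fernet

# A machine-unique identifier; here derived from the MAC address as an
# illustrative choice only.
machine_id = uuid.getnode().to_bytes(6, "big")
session_nonce = os.urandom(16)  # fresh randomness per session

# Derive a per-session key bound to this machine's identifier, so that
# ciphertext from one machine/session is useless elsewhere.
key_material = hashlib.sha256(machine_id + session_nonce).digest()
session_key = base64.urlsafe_b64encode(key_material)

cipher = Fernet(session_key)
token = cipher.encrypt(b"status report")
print(cipher.decrypt(token))
```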
Abstract:
The concept of cloud computing services is appealing to small and medium enterprises (SMEs), offering the opportunity to acquire modern information technology resources as a utility and avoid costly capital investments in technology resources. However, the adoption of cloud computing services presents significant challenges to SMEs. SMEs need to determine a path to adopting cloud computing services that will ensure their sustainable presence in the cloud computing environment. Information about approaches to adopting cloud computing services by SMEs is fragmented. Through an interpretive design, we suggest that SMEs need to have a strategic and incremental intent, understand their organizational structure, understand the external factors, consider their human resource capacity, and understand the value they expect from cloud computing services in order to forge a successful path to adoption. These factors would contribute to a model of cloud services for SMEs.
Abstract:
The placement of mappers and reducers on machines directly affects the performance and cost of a MapReduce computation in cloud computing. From a computational point of view, the mappers/reducers placement problem is a generalization of the classical bin packing problem, which is NP-complete. In this paper we therefore propose a new heuristic algorithm for the mappers/reducers placement problem in cloud computing and evaluate it against several other heuristics on solution quality and computation time by solving a set of test problems with various characteristics. The computational results show that our heuristic algorithm is much more efficient than the other heuristics. We also verify the effectiveness of our heuristic algorithm by comparing the mapper/reducer placement it generates for a benchmark problem with a conventional mapper/reducer placement. The comparison shows that computation using our mapper/reducer placement is much cheaper while still satisfying the computation deadline.
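The paper's own heuristic is not given in the abstract, but since the placement problem generalizes bin packing, a classical baseline such as first-fit decreasing shows the family of heuristics involved. The sketch below is illustrative only, with task sizes and machine capacity invented for the example:

```python
def first_fit_decreasing(task_sizes, machine_capacity):
    """First-fit-decreasing bin packing: sort tasks (mappers/reducers)
    by size and place each on the first machine with spare capacity,
    opening a new machine only when none fits."""
    machines = []  # each entry is [used_capacity, [task sizes...]]
    for size in sorted(task_sizes, reverse=True):
        for machine in machines:
            if machine[0] + size <= machine_capacity:
                machine[0] += size
                machine[1].append(size)
                break
        else:
            machines.append([size, [size]])  # open a new machine
    return machines

# Example: pack ten tasks onto machines of capacity 10.
placement = first_fit_decreasing([7, 5, 4, 4, 3, 3, 2, 2, 1, 1], 10)
print(len(placement), "machines:", [m[1] for m in placement])
```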