Abstract:
This paper describes the development of an analytical model used to simulate the fatigue behaviour of roof cladding during the passage of a tropical cyclone. The model, incorporated into a computer program, uses wind pressure data from wind tunnel tests in combination with time history information on wind speed and direction during a tropical cyclone, and experimental fatigue characterisation data for roof claddings. The wind pressure data are analysed using a rainflow form of analysis, and a fatigue damage index is calculated using a modified form of Miner's rule. Some of the results obtained to date and their significance in relation to the review of current fatigue tests are presented. The model appears reasonable for comparative estimation of fatigue life, but an improvement of Miner's rule is required for the prediction of actual fatigue life.
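To illustrate the damage-index step, the following is a minimal sketch of Miner's rule applied to cycle counts binned by stress range, such as a rainflow analysis might produce. The S-N curve constants and the cycle counts are invented placeholders, not values from the paper.

```python
# Minimal sketch: Miner's rule damage accumulation over binned cycle
# counts. The S-N constants C and m below are illustrative only.

def cycles_to_failure(stress_range, C=1e12, m=3.0):
    """Simple S-N curve: N = C / S^m (illustrative constants)."""
    return C / stress_range ** m

def miner_damage(cycle_bins):
    """cycle_bins: list of (stress_range, n_cycles) pairs.
    Returns the cumulative damage index D = sum(n_i / N_i);
    D >= 1 indicates predicted fatigue failure."""
    return sum(n / cycles_to_failure(s) for s, n in cycle_bins)

# Example bins such as a rainflow count might yield
bins = [(50.0, 10000), (100.0, 2000), (200.0, 100)]
D = miner_damage(bins)
print(D)
```

A "modified" Miner's rule, as the paper suggests, would typically adjust the failure threshold or weight low-amplitude cycles differently; the linear form above is the textbook baseline.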
Abstract:
Microvessel density (MVD) is a widely used surrogate measure of angiogenesis in pathological specimens and tumour models. MVD can be measured by several methods, and automating the counting aims to increase the speed, reliability and reproducibility of these techniques. The image analysis system described here enables MVD measurement to be carried out with minimal expense in any reasonably equipped pathology department or laboratory. It is demonstrated that the system translates easily, with minimal calibration, between tumour types that are suitably stained. The aim of this paper is to offer this technique to a wider field of researchers in angiogenesis.
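As a hedged sketch of the counting step only: given a binary mask of stained (vessel) pixels, connected components can be counted as a crude proxy for microvessel number per field. The real system described in the paper involves staining-specific thresholding and calibration; the mask and the flood-fill approach here are illustrative assumptions.

```python
# Sketch: count 4-connected stained regions in a binary mask.

def count_vessels(mask):
    """mask: 2D list of 0/1 values. Returns the number of
    4-connected blobs (candidate microvessels)."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]          # iterative flood fill
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return count

mask = [[1, 1, 0, 0],
        [0, 0, 0, 1],
        [1, 0, 0, 1]]
print(count_vessels(mask))  # three separate stained regions
```

Dividing such a count by the field area would give a density figure comparable across fields.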
Abstract:
This paper offers a discussion on the “mundane” or quotidian aspects of that software which might at first glance seem to be a fine example of the extraordinary. It looks at game worlds in terms of an ancient human desire to articulate place in the world and pursues a design concept which resonates with this practice in order to enable a more mundane exploitation of such spatial representations: the claiming of place.
Abstract:
Many games now on the market come with a Software Development Kit, or SDK, which allows players to construct their own worlds and mod(ify) the original. One or two of these mods have achieved notoriety in the press, cited as evidence of malicious intent on the part of the modders, who often exploit their own lived experience as a basis for new virtual playgrounds. But most player-constructed games are a source of delight and pleasure for the builder and for the community of players. Creating a game is the act of creating a world, of making a place.
Abstract:
Drawing on data from the Australian Business Assessment of Computer User Security (ABACUS) survey, this paper examines a range of factors that may influence businesses’ likelihood of being victimised by a computer security incident. It has been suggested that factors including business size, industry sector, level of outsourcing, expenditure on computer security functions and types of computer security tools and/or policies used may influence the probability of particular businesses experiencing such incidents. This paper uses probability modelling to test whether this is the case for the 4,000 businesses that responded to the ABACUS survey. It was found that the industry sector that a business belonged to, and business expenditure on computer security, were not related to businesses’ likelihood of detecting computer security incidents. Instead, the number of employees that a business has and whether computer security functions were outsourced were found to be key indicators of businesses’ likelihood of detecting incidents. Some of the implications of these findings are considered in this paper.
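The probability modelling described above can be sketched with a minimal logistic model of incident-detection probability from two predictors the paper found significant: business size (number of employees) and whether computer security functions were outsourced. The data rows and the hand-rolled gradient-descent fit below are invented for illustration; the actual ABACUS analysis is not reproduced here.

```python
import math

# Sketch: logistic regression fitted by stochastic gradient descent.
# Features: (log10 of employee count, outsourced security 0/1).
# Outcome: whether a security incident was detected (0/1).

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, epochs=5000):
    """Returns weights [bias, w1, w2] for the toy data."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = p - yi
            w[0] -= lr * err
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * err * xj
    return w

# Invented toy rows: larger / outsourced businesses detect more incidents
X = [(0.5, 0), (1.0, 0), (1.5, 1), (2.0, 1), (2.5, 1), (3.0, 1)]
y = [0, 0, 0, 1, 1, 1]
w = fit_logistic(X, y)
p_large = sigmoid(w[0] + w[1] * 3.0 + w[2] * 1)
p_small = sigmoid(w[0] + w[1] * 0.5 + w[2] * 0)
print(p_large > p_small)
```

On this toy data the fitted model assigns a higher detection probability to the larger, outsourced business, mirroring the direction of the survey's finding.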
Abstract:
The relationship between coronal knee laxity and the restraining properties of the collateral ligaments remains unknown. This study investigated correlations between the structural properties of the collateral ligaments and stress angles used in computer-assisted total knee arthroplasty (TKA), measured with an optically based navigation system. Ten fresh-frozen cadaveric knees (mean age: 81 ± 11 years) were dissected to leave the menisci, cruciate ligaments, posterior joint capsule and collateral ligaments. The resected femur and tibia were rigidly secured within a test system which permitted kinematic registration of the knee using a commercially available image-free navigation system. Frontal plane knee alignment and varus-valgus stress angles were acquired. The force applied during varus-valgus testing was quantified. Medial and lateral bone-collateral ligament-bone specimens were then prepared, mounted within a uni-axial materials testing machine, and extended to failure. Force and displacement data were used to calculate the principal structural properties of the ligaments. The mean varus laxity was 4 ± 1° and the mean valgus laxity was 4 ± 2°. The corresponding mean manual force applied was 10 ± 3 N and 11 ± 4 N, respectively. While measures of knee laxity were independent of the ultimate tensile strength and stiffness of the collateral ligaments, there was a significant correlation between the force applied during stress testing and the instantaneous stiffness of the medial (r = 0.91, p = 0.001) and lateral (r = 0.68, p = 0.04) collateral ligaments. These findings suggest that clinicians may perceive a rate of change of ligament stiffness as the end-point during assessment of collateral knee laxity.
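The correlation analysis reported above (e.g. r = 0.91 between applied force and medial ligament stiffness) is a standard Pearson product-moment correlation, sketched below. The paired force/stiffness values are invented for illustration only; they are not data from the study.

```python
import math

# Sketch: Pearson's r between applied stress-test force and
# instantaneous ligament stiffness (invented sample values).

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

force = [7.0, 8.5, 9.0, 10.5, 12.0, 13.5]         # N, applied force
stiffness = [30.0, 34.0, 33.5, 39.0, 44.0, 46.0]  # N/mm, instantaneous
r = pearson_r(force, stiffness)
print(round(r, 2))
```

A strong positive r, as in the study, suggests examiners apply more force where the ligament is stiffer, consistent with the proposed rate-of-change end-point.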
Abstract:
Novel computer vision techniques have been developed for automatic monitoring of crowded environments such as airports, railway stations and shopping malls. Using video feeds from multiple cameras, the techniques enable crowd counting, crowd flow monitoring, queue monitoring and abnormal event detection. The outcome of the research is useful for surveillance applications and for obtaining operational metrics to improve business efficiency.
Abstract:
Computer generated materials are ubiquitous and we encounter them on a daily basis, even though most people are unaware that this is the case. Blockbuster movies, television weather reports and telephone directories all include material that is produced by utilising computer technologies. Copyright protection for materials generated by a programmed computer was considered by the Federal Court and Full Court of the Federal Court in Telstra Corporation Limited v Phone Directories Company Pty Ltd. The court held that the White and Yellow pages telephone directories produced by Telstra and its subsidiary, Sensis, were not protected by copyright because they were computer-generated works which lacked the requisite human authorship. The Copyright Act 1968 (Cth) does not contain specific provisions on the subsistence of copyright in computer-generated materials. Although the issue of copyright protection for computer-generated materials has been examined in Australia on two separate occasions by independently-constituted Copyright Law Review Committees over a period of 10 years (1988 to 1998), the Committees’ recommendations for legislative clarification by the enactment of specific amendments to the Copyright Act have not yet been implemented and the legal position remains unclear. In the light of the decision of the Full Federal Court in Telstra v Phone Directories it is timely to consider whether specific provisions should be enacted to clarify the position of computer-generated works under copyright law and, in particular, whether the requirement of human authorship for original works protected under Part III of the Copyright Act should now be reconceptualised to align with the realities of how copyright materials are created in the digital era.
Abstract:
Floods are among the most devastating events affecting primarily tropical, archipelagic countries such as the Philippines. With current predictions of climate change set to include rising sea levels, intensification of typhoon strength and a general increase in mean annual precipitation throughout the Philippines, it has become paramount to prepare for the future so that the country's increased flood risk does not translate into more economic and human loss. Field work and data gathering were done within the framework of an internship at the former German Technical Cooperation (GTZ), in cooperation with the Local Government Unit of Ormoc City, Leyte, the Philippines, in order to develop a dynamic computer-based flood model for the basin of the Pagsangaan River. To this end, different geo-spatial analysis tools such as PCRaster and ArcGIS, hydrological analysis packages and basic engineering techniques were assessed and implemented. The aim was to develop a dynamic flood model and to use the development process to determine the required data, their availability and their impact on the results, as a case study for flood early warning systems in the Philippines. The hope is that such projects can help to reduce flood risk by including the results of worst-case scenario analyses and current climate change predictions in city planning for municipal development, monitoring strategies and early warning systems. The project was developed using a 1D-2D coupled model in SOBEK (Deltares' hydrological modelling software package), which was also used as a case study to analyse and understand the influence of factors such as land use, schematisation, time step size and tidal variation on the flood characteristics. Several sources of relevant satellite data were compared, such as Digital Elevation Models (DEMs) from ASTER and SRTM data, as well as satellite rainfall data from the GIOVANNI server (NASA) and field gauge data.
Different methods were used in the attempt to partially calibrate and validate the model, which was then used to simulate and study two climate change scenarios based on scenario A1B predictions. It was observed that large areas currently considered not prone to floods will become low flood risk (0.1-1 m water depth). Furthermore, larger sections of the floodplains upstream of Lilo-an's Bridge will become moderate flood risk areas (1-2 m water depth). The flood hazard maps created during the project will be presented to the LGU, and the model will be used by GTZ's Local Disaster Risk Management Department to create a larger set of possible flood-prone areas related to rainfall intensity and to study possible improvements to the current early warning system and to the monitoring of the basin section belonging to Ormoc City. Recommendations on further enhancement of the geo-hydro-meteorological data, to improve the model's accuracy mainly in areas of interest, will also be presented to the LGU.
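The hazard classes quoted above can be sketched as a simple mapping from simulated water depth to a risk label. The 0.1-1 m "low" and 1-2 m "moderate" breaks follow the depth ranges in the text; the "high" class above 2 m and the 0.1 m lower cut-off for "none" are added as assumptions.

```python
# Sketch: classify simulated flood depths into hazard classes.
# Breaks for "low" and "moderate" follow the ranges quoted in the
# study; "none" below 0.1 m and "high" above 2 m are assumptions.

def hazard_class(depth_m):
    if depth_m < 0.1:
        return "none"
    if depth_m < 1.0:
        return "low"
    if depth_m < 2.0:
        return "moderate"
    return "high"

depths = [0.05, 0.4, 1.5, 2.6]   # simulated depths at sample grid cells
print([hazard_class(d) for d in depths])
# → ['none', 'low', 'moderate', 'high']
```

Applied cell-by-cell over the model's 2D output grid, this yields the kind of flood hazard map described above.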
Abstract:
There are a number of pressing issues facing contemporary online environments that are causing disputes among participants and platform operators and increasing the likelihood of external regulation. A number of solutions have been proposed, including industry self-governance, top-down regulation and emergent self-governance such as EVE Online’s “Council of Stellar Management”. However, none of these solutions seem entirely satisfying; facing challenges from developers who fear regulators will not understand their platforms, or players who feel they are not sufficiently empowered to influence the platform, while many authors have raised concerns over the implementation of top-down regulation, and why the industry may be well-served to pre-empt such action. This paper considers case studies of EVE Online and the offshore gambling industry, and whether a version of self-governance may be suitable for the future of the industry.
Abstract:
The movement of exotic biota into native ecosystems is central to debates about the acclimatisation of plants in the settler colonies of the nineteenth century. For example, plants like lucerne from Europe and sudan grass from South Africa were transferred to Australia to support pastoral economies. The saltbush Atriplex spp. is an anomaly: it, too, eventually became the subject of acclimatisation within its native Australia, because it was also deemed useful to the pastoralists of arid and semi-arid New South Wales. When settlers first came to this part of Australia, however, initial perceptions were that the plants were useless. We trace this transformation from the desert 'desperation' plant of early settlement to the 'precious' conservation species of the 1880s onwards, when there were changes in both management strategies and cultural responses to saltbush in Australia. This reconsideration can be seen in scientific assessments and experiments, in the way the plant was commoditised by seed and nursery traders, and in its use as a metaphor in bush poetry to connote a gendered nationalist figure in Saltbush Bill. We argue that while initial settlers were often optimistic about European management techniques, they had nothing but contempt for indigenous plants. The later impulses towards the conservation of natives arose from experiences of bitter failure and despair over attempts to impose European methods, which in turn forced this re-evaluation of Australian species.
Abstract:
Fire incidents in buildings are common, so the fire safety design of framed structures is imperative, especially for unprotected or partly protected bare steel frames. However, software for structural fire analysis is not widely available. Performance-based structural fire design is therefore best served by user-friendly, conventional nonlinear analysis programs, so that engineers do not need to acquire new software specifically for structural fire analysis and design. Such a tool should be able to simulate different fire scenarios and their detrimental effects efficiently, including second-order P-Δ and P-δ effects and material yielding. The nonlinear behaviour of large-scale structures becomes complicated under fire, so simulation relies on efficient and effective numerical analysis to cope with the intricate nonlinear effects involved. To this end, the present study uses NIDA, a second-order elastic/plastic analysis program, to predict the structural behaviour of bare steel framed structures at elevated temperatures. The study considers thermal expansion and material degradation due to heating. Degradation of material strength with increasing temperature is modelled by a set of temperature-stress-strain curves, mainly according to BS5950: Part 8, which implicitly allow for creep deformation. The finite element stiffness formulation of the beam-column elements is derived from the fifth-order PEP element, which allows modelling with one element per member. The Newton-Raphson method is used in the nonlinear solution procedure to trace the nonlinear equilibrium path at specified elevated temperatures. Several numerical and experimental verifications of framed structures are presented and compared against solutions in the literature.
The proposed method permits engineers to adopt the performance-based structural fire analysis and design using typical second-order nonlinear structural analysis software.
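The Newton-Raphson path tracing mentioned above can be sketched in one dimension: at each load step the displacement is corrected until the residual between internal and external force vanishes, restarting each step from the previously converged state. The softening internal-force law below is invented for illustration; the actual analysis uses temperature-dependent beam-column element stiffness.

```python
# Sketch: Newton-Raphson iteration tracing a nonlinear equilibrium
# path. F_int(u) is an invented softening law, not NIDA's model.

def f_int(u):
    """Illustrative nonlinear internal force (softens with u)."""
    return 100.0 * u - 15.0 * u ** 3

def df_int(u):
    """Tangent stiffness dF_int/du."""
    return 100.0 - 45.0 * u ** 2

def newton_raphson(f_ext, u0=0.0, tol=1e-10, max_iter=50):
    u = u0
    for _ in range(max_iter):
        residual = f_int(u) - f_ext
        if abs(residual) < tol:
            return u
        u -= residual / df_int(u)   # Newton correction
    raise RuntimeError("did not converge")

# Trace the path over increasing load steps, restarting from the
# previous converged displacement, as a path-following solver does.
u = 0.0
for f_ext in [20.0, 40.0, 60.0]:
    u = newton_raphson(f_ext, u0=u)
print(round(u, 4))
```

In the actual analysis the same loop runs on the assembled global stiffness matrix at each specified elevated temperature, with the tangent updated for material degradation.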
Abstract:
Pile foundations transfer loads from superstructures to stronger subsoil, so their strength and stability can affect structural safety. This paper treats the response of a reinforced concrete pile in saturated sand to a buried explosion. Fully coupled computer simulation techniques are used together with five different material models. The influence of reinforcement on pile response is investigated, and the important safety parameters of horizontal deformation and tensile stress in the pile are evaluated. The results indicate that adequate longitudinal reinforcement and proper detailing of transverse reinforcement can reduce pile damage. The present findings can serve as a benchmark reference for future analysis and design.
Abstract:
This chapter draws together the key themes and perspectives from the preceding chapters and offers a critique of the theoretical reframing, underpinned by children's rights and child agency, that has informed the book. Additionally, the documented research is situated within broader international contexts of ECE research, offering insights that can inform the field more generally. This is a forward-looking discussion of current research that offers clear directions for ECEfS and future research in this field.
Abstract:
This thesis developed a method for real-time and handheld 3D temperature mapping using a combination of off-the-shelf devices and efficient computer algorithms. It contributes a new sensing and data processing framework to the science of 3D thermography, unlocking its potential for application areas such as building energy auditing and industrial monitoring. New techniques for the precise calibration of multi-sensor configurations were developed, along with several algorithms that ensure both accurate and comprehensive surface temperature estimates can be made for rich 3D models as they are generated by a non-expert user.