470 results for Software architecture document
Abstract:
Background: Adolescent Idiopathic Scoliosis is the most common type of spinal deformity, and its aetiology remains unclear. Studies suggest that gravitational forces in the standing position play an important role in scoliosis progression; therefore, anthropometric data are required to develop biomechanical models of the deformity. Few studies have analysed the trunk by vertebral level and none have performed investigations of the scoliotic trunk. The aim of this study was to determine the centroid, thickness, volume and estimated mass for sections of the trunk in Adolescent Idiopathic Scoliosis patients. Methods: Existing low-dose Computed Tomography scans were used to estimate vertebral level-by-level torso masses for 20 female Adolescent Idiopathic Scoliosis patients. ImageJ processing software was used to analyse the Computed Tomography images and enable estimation of the segmental torso mass corresponding to each vertebral level. Findings: The patients’ mean age was 15.0 (SD 2.7) years, with a mean major Cobb angle of 52° (SD 5.9) and mean patient weight of 58.2 (SD 11.6) kg. The magnitude of torso segment mass corresponding to each vertebral level increased by 150% from 0.6 kg at T1 to 1.5 kg at L5. Similarly, the segmental thickness corresponding to each vertebral level from T1-L5 increased inferiorly from a mean 18.5 (SD 2.2) mm at T1 to 32.8 (SD 3.4) mm at L5. The mean total trunk mass, as a percentage of total body mass, was 27.8% (SD 0.5), which was close to values reported in previous literature. Interpretation: This study provides new anthropometric reference data on segmental (vertebral level-by-level) torso mass in Adolescent Idiopathic Scoliosis patients, useful for biomechanical models of scoliosis progression and treatment.
Abstract:
Globalization, financial deregulation, economic turmoil, and technology breakthroughs are profoundly exposing organizations to business networks. Engaging these networks requires explicit planning from the strategic level down to the operational level of an organization, which significantly affects organizational artefacts such as business services, processes, and resources. Although enterprise architecture (EA) aligns business and IT aspects of organizational systems, previous applications of EA have not comprehensively addressed a methodological framework for planning. In the context of business networks, this study seeks to explore the application of EA for business network planning, building upon relevant and well-established prescriptive and descriptive aspects of EA. Prescriptive aspects include integrated models of services, business processes, and resources, among other organizational artefacts, at both business and IT levels. Descriptive aspects include ontological classifications of business functionality, which allow EA models to be aligned semantically to organizational artefacts and, ultimately, to higher-level business strategy. A prominent approach for capturing descriptive aspects of EA is business capability modelling. In order to explore and develop the illustrative extensions of EA through capability modelling, a list of requirements (capability dimensions) for business network planning will be identified and validated through a revelatory case study encompassing different business network manifestations, or situations. These include virtual organization, liquid workforce, business network orchestration, and headquarters-subsidiary. The use of artefacts conventionally modelled through EA will be considered in these network situations. Two general considerations for EA extensions are explored for the identified requirements at the level of the network: extension of artefacts through the network, and alignment of network-level artefacts with individual organization artefacts. The list of requirements provides the basis for a constructivist extension of EA in the following ways. Firstly, for descriptive aspects, it offers constructivist insights to guide extensions for particular EA techniques and concepts. Secondly, for prescriptive aspects, it defines a set of capability dimensions which improve the analysis and assessment of organization capabilities for business network situations.
Abstract:
language (such as C++ and Java). The model used allows watermarks to be inserted at three “orthogonal” levels. At the first level, watermarks are injected into objects. The second level of watermarking is used to select proper variants of the source code. The third level uses a transition function that can be used to generate copies with different functionalities. Generic watermarking schemes were presented and their security discussed.
Abstract:
Precise clock synchronization is essential in emerging time-critical distributed control systems operating over computer networks, where the clock synchronization requirements are mostly focused on relative clock synchronization and high synchronization precision. Existing clock synchronization techniques such as the Network Time Protocol (NTP) and the IEEE 1588 standard can be difficult to apply to such systems because of the highly precise hardware clocks required, the network congestion caused by a high frequency of synchronization message transmissions, and high overheads. In response, we present a Time Stamp Counter based precise Relative Clock Synchronization Protocol (TSC-RCSP) for distributed control applications operating over local-area networks (LANs). In our protocol a software clock based on the TSC register, counting CPU cycles, is adopted in the time clients and server. TSC-based clocks offer clients a precise, stable and low-cost clock synchronization solution. Experimental results show that clock precision on the order of 10 microseconds can be achieved in small-scale LAN systems. Such clock precision is much higher than that of a processor's Time-Of-Day clock, and is easily sufficient for most distributed real-time control applications over LANs.
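The abstract describes the mechanism only briefly, so the following is a minimal sketch of a TSC-based software clock and the relative-offset step a client might apply, assuming an x86 platform with the GCC/Clang __rdtsc() intrinsic; the calibration value, the helper names, and the NTP-style offset formula are illustrative assumptions rather than details of TSC-RCSP itself.

```cpp
#include <cstdint>
#include <x86intrin.h>  // __rdtsc() on x86 with GCC/Clang

// Software clock built on the CPU's Time Stamp Counter (cycle counter).
struct TscClock {
    uint64_t epoch;   // TSC value taken as the local time origin
    double   tsc_hz;  // calibrated counter frequency in Hz (assumed known)

    explicit TscClock(double hz) : epoch(__rdtsc()), tsc_hz(hz) {}

    // Elapsed local time in microseconds since this clock was created.
    double now_us() const {
        return static_cast<double>(__rdtsc() - epoch) * 1e6 / tsc_hz;
    }
};

// Relative offset between a client clock and the server clock, estimated from
// one request/response exchange (t1: client send, t2: server receive,
// t3: server send, t4: client receive), i.e. the usual symmetric-delay formula.
double estimate_offset_us(double t1, double t2, double t3, double t4) {
    return ((t2 - t1) + (t3 - t4)) / 2.0;
}
```

A client would repeatedly apply such an estimated offset to its local TscClock readings to keep them aligned, in a relative sense, with the time server.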
Abstract:
During the early design stages of construction projects, accurate and timely cost feedback is critical to design decision making. This is particularly challenging for cost estimators, as they must quickly and accurately estimate the cost of the building while the design is still incomplete and evolving. State-of-the-art software tools typically use a rule-based approach to generate detailed quantities from the design details present in a building model and relate them to the cost items in a cost estimating database. In this paper, we propose a generic approach for creating and maintaining a cost estimate using flexible mappings between a building model and a cost estimate. The approach uses queries on the building design to populate views, and each view is then associated with one or more cost items. The benefit of this approach is that the flexibility of modern query languages allows the estimator to encode a broad variety of relationships between the design and the estimate. It also avoids the use of a common standard to which both designers and estimators must conform, giving the estimator added flexibility and functionality in their work.
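To make the query-to-view-to-cost-item mapping concrete, here is a small sketch under assumed data: the Element fields, the concrete query predicate, and the unit rate are invented for illustration and are not taken from the paper or from any particular estimating database.

```cpp
#include <algorithm>
#include <iterator>
#include <string>
#include <vector>

// A highly simplified building-model element.
struct Element {
    std::string type;      // e.g. "wall", "slab"
    std::string material;  // e.g. "concrete"
    double      area_m2;   // quantity used by the cost item below
};

// A "view" is the result of a query over the building model.
using View = std::vector<Element>;

// Example query: all concrete walls. The estimator can swap in any predicate.
View query_concrete_walls(const std::vector<Element>& model) {
    View v;
    std::copy_if(model.begin(), model.end(), std::back_inserter(v),
                 [](const Element& e) {
                     return e.type == "wall" && e.material == "concrete";
                 });
    return v;
}

// Each view is associated with one or more cost items; here a single
// per-square-metre rate (an assumed figure) is applied to the whole view.
double cost_of_view(const View& v, double rate_per_m2) {
    double total = 0.0;
    for (const auto& e : v) total += e.area_m2 * rate_per_m2;
    return total;
}
```

The point of the design is that the query, not a fixed standard shared by designer and estimator, defines the mapping, so the estimator can revise it as the building model evolves.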
Abstract:
This article presents a study of how humans perceive and judge the relevance of documents. Humans are adept at making reasonably robust and quick decisions about what information is relevant to them, despite the ever-increasing complexity and volume of their surrounding information environment. The literature on document relevance has identified various dimensions of relevance (e.g., topicality, novelty, etc.); however, little is understood about how these dimensions may interact. We performed a crowdsourced study of how human subjects judge two relevance dimensions in relation to document snippets retrieved from an internet search engine. The order of the judgments was controlled. For those judgments exhibiting an order effect, a q-test was performed to determine whether the order effects can be explained by a quantum decision model based on incompatible decision perspectives. Some evidence of incompatibility was found, which suggests that incompatible decision perspectives are appropriate for explaining interacting dimensions of relevance in such instances.
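For orientation, the sketch below computes one common formulation of such a statistic, the quantum question (QQ) equality, under which the two "disagreeing" answer proportions should balance across judgment orders; whether this is exactly the q-test used in the study is an assumption on my part, and the proportions shown are placeholders, not data from the article.

```cpp
#include <cstdio>

// Proportions of paired "yes"/"no" answers for two relevance dimensions A and B,
// collected under both judgment orders (A-then-B and B-then-A).
struct OrderedProportions {
    double yes_no;  // first question answered "yes", second answered "no"
    double no_yes;  // first question answered "no",  second answered "yes"
};

// Quantum question (QQ) equality statistic: for incompatible (non-commuting)
// questions the "disagreeing" proportions should balance across the two orders,
// so a q value close to 0 is consistent with the quantum decision model.
double qq_statistic(OrderedProportions a_first, OrderedProportions b_first) {
    return (a_first.yes_no + a_first.no_yes) - (b_first.yes_no + b_first.no_yes);
}

int main() {
    // Placeholder proportions for illustration only.
    OrderedProportions a_first{0.25, 0.30};
    OrderedProportions b_first{0.28, 0.27};
    std::printf("q = %.3f\n", qq_statistic(a_first, b_first));
    return 0;
}
```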
Abstract:
The Internet and its widespread usage for multimedia document distribution have put the copyright issue in a completely new setting. Multimedia documents, specifically those installed on a web page, are no longer passive, as they typically include active applets. Copyright protection safeguards the intellectual property (IP) of multimedia documents, which are either sold or distributed free of charge. In this chapter, the basic tools for copyright protection are discussed. First, general concepts and the vocabulary used in copyright protection of multimedia documents are presented. Next, a taxonomy of watermarking and fingerprinting techniques is studied. This part is concluded by a review of the literature dealing with IP security. The main part of the chapter discusses the generic watermarking scheme and illustrates it with three specific examples: collusion-free watermarking, spread spectrum watermarking, and software fingerprinting. Future trends and conclusions close the chapter.
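As an illustration of one of the three examples named above, the sketch below shows a textbook additive spread-spectrum embedding and correlation detector; the strength parameter, the sequence construction, and the detection threshold are illustrative assumptions rather than the chapter's specific scheme.

```cpp
#include <algorithm>
#include <cstddef>
#include <random>
#include <vector>

// The watermark is a pseudorandom +/-1 sequence; the PRNG seed acts as the key.
std::vector<double> make_watermark(std::size_t n, unsigned seed) {
    std::mt19937 rng(seed);
    std::bernoulli_distribution bit(0.5);
    std::vector<double> w(n);
    for (auto& x : w) x = bit(rng) ? 1.0 : -1.0;
    return w;
}

// Embed: v'_i = v_i + alpha * w_i over the host signal's coefficients.
void embed(std::vector<double>& coeffs, const std::vector<double>& w, double alpha) {
    for (std::size_t i = 0; i < coeffs.size() && i < w.size(); ++i)
        coeffs[i] += alpha * w[i];
}

// Detect: normalized correlation between the (possibly attacked) coefficients
// and the watermark; a value near alpha indicates the mark is present, while
// an unmarked signal yields a value near zero.
double detect(const std::vector<double>& coeffs, const std::vector<double>& w) {
    double corr = 0.0;
    std::size_t n = std::min(coeffs.size(), w.size());
    for (std::size_t i = 0; i < n; ++i) corr += coeffs[i] * w[i];
    return n ? corr / static_cast<double>(n) : 0.0;
}
```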
Abstract:
In 2006, Gaurav Gupta and Josef Pieprzyk presented an attack on the branch-based software watermarking scheme proposed by Ginger Myles and Hongxia Jin in 2005. The software watermarking model is based on replacing jump instructions or unconditional branch statements (UBS) with calls to a fingerprint branch function (FBF) that computes the correct target address of the UBS as a function of the generated fingerprint and an integrity check. If the program is tampered with, the fingerprint and/or integrity checks change and the target address is not computed correctly. Gupta and Pieprzyk's attack uses debugger capabilities such as register and address lookup and breakpoints to minimize the need to manually inspect the software. Using these resources, the FBF and the calls to it are identified, correct displacement values are generated, and calls to the FBF are replaced by the original UBS, transferring control to the correct target instruction. In this paper, we propose a watermarking model that provides security against such debugging attacks. Two primary measures taken are shifting the stack pointer modification operation from the FBF to the individual UBSs, and coding the stack pointer modification in the same language as the rest of the code rather than in assembly language to avoid conspicuous content. The manual component complexity increases from O(1) in the previous scheme to O(n) in our proposed scheme.
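A minimal sketch of the fingerprint-branch-function idea as described above, with the branch target modelled as an index into a table of continuation functions; the hash, the XOR encoding, and all identifiers are invented for illustration and are not the actual Myles-Jin or Gupta-Pieprzyk construction.

```cpp
#include <cstddef>
#include <cstdint>

using Target = void (*)();  // a continuation standing in for a branch target

// Toy integrity check over the protected code bytes; any tampering changes it.
uint32_t integrity_hash(const uint8_t* code, std::size_t len) {
    uint32_t h = 2166136261u;                       // FNV-1a style accumulation
    for (std::size_t i = 0; i < len; ++i) h = (h ^ code[i]) * 16777619u;
    return h;
}

// Fingerprint branch function: each original unconditional branch is replaced
// by a call to this function. The correct table index was encoded at embed
// time as (index ^ fingerprint ^ hash), so a modified binary or a forged
// fingerprint decodes to the wrong target.
Target fbf(uint32_t encoded, uint32_t fingerprint,
           const uint8_t* code, std::size_t len,
           const Target* table, std::size_t table_size) {
    uint32_t index = (encoded ^ fingerprint ^ integrity_hash(code, len)) % table_size;
    return table[index];
}
```

Replacing each UBS with a call to a function of this shape ties correct control flow to both the embedded fingerprint and the integrity of the code, which is why naive patching of the binary breaks the program.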
Abstract:
We no longer have the luxury of time as the effects of climate change are being felt, according to the latest Intergovernmental Panel on Climate Change report, on every continent and in every ocean. More than 50% of the population of the United States and 85% of Australians live in coastal regions. The number of people living in the world’s coastal regions is expected to increase, along with the need to improve the capacity to mitigate hazards and manage the multiple risks that have been identified by the scientific community. Under the auspices of the Association of Collegiate Schools of Architecture (ACSA), design academics and practitioners from the Americas, Asia, and Australia met in Fort Lauderdale, Florida, for the fourth Subtropical Cities international conference to share outcomes of research and new pedagogies to address the critical transformation of the physical environments and infrastructures of the world’s vulnerable coastal communities. The theme of Subtropical Cities, adopted by the ACSA for its Fall 2014 Conference, is not confined entirely to a latitudinal or climatic frame of reference. The paper and project presentations addressed a range of theoretical, practice-led, and education-oriented research topics in architecture and urban design related to the subtropics, with emphasis on urban and coastal regions. More than half the papers originated from universities and practices in coastal regions. Threads emerged from a tapestry of localized investigations to reveal a more global understanding of possible futures we are designing for current and future generations. The one hundred-plus conference delegates and presenters represented 33 universities and institutions from across the United States, Mexico, Canada, Australia, the Middle East, Peru and China. Case studies from India, Morocco, Tahiti, Indonesia, Jordan, and Cambodia were also presented, expanding the global knowledge base. Co-authored submissions presented new directions for architecture and design, with a resounding theme of collaboration across diverse disciplines. The ability to deal with abstraction and complexity, and the capacity to develop synthesis and frameworks for defining problem boundaries, can be considered key attributes of architectural thinking. Such a unique set of abilities can forge collaboration with different professional disciplines to achieve extraordinary outcomes. As the broad range of papers presented at this conference suggests, existing architectural and urban typologies and practices are increasingly considered part of the cause rather than the solution when adapting to climate change and sea level rise. Design responses and the actions needed to generate new and unfamiliar forms of urbanism and infrastructure for defense, adaptation, and retreat in subtropical urban regions are being actively explored in academic design studios and research projects around the world. Many presentations proposed provocative and experimental strategies as the global climate moves beyond our “comfort zone”. The ideas presented at the Subtropical Cities conference are timely, as options for low-energy passive climatic design become increasingly limited in the context of a changing climate. At the same time, ways of reducing or phasing out energy-intensive mechanical systems in densely populated urban centres present additional challenges for designers and communities as a whole.
The conference was marked by a common theme of trans-disciplinary research, in which design integration with emerging technologies resonates with a reaffirmation of the centrality of design thinking, expanding the scope of the traditional architecture studio pedagogy to integrate knowledge from other disciplines and the participation of diverse communities.
Abstract:
Design Science is the process of solving ‘wicked problems’ through designing, developing, instantiating, and evaluating novel solutions (Hevner, March, Park and Ram, 2004). Wicked problems are described as agent finitude in combination with problem complexity and normative constraint (Farrell and Hooker, 2013). In Information Systems Design Science, determining that problems are ‘wicked’ differentiates Design Science Research (DSR) from Solutions Engineering (Winter, 2008) and is a necessary part of proving relevance to Information Systems Design Science research (Hevner, 2007; Iivari, 2007). Problem complexity is characterised as many problem components with nested, dependent and co-dependent relationships interacting through multiple feedback and feed-forward loops. Farrell and Hooker (2013) specifically state that for wicked problems “it will often be impossible to disentangle the consequences of specific actions from those of other co-occurring interactions”. This paper discusses the application of an Enterprise Information Architecture modelling technique to disentangle wicked problem complexity in one case. It proposes that such a modelling technique can be applied to other wicked problems and can lay the foundations for proving relevance to DSR, provide solution pathways for artefact development, and help substantiate the elements required to produce Design Theory.
Abstract:
One of the first architects to write a book was Vitruvius, the Roman architect who published De Architectura in the 1st century BC, a book that would become the foundation of Western architectural thought. When I was an undergraduate, the history of architecture was taught via a series of books by architects that were at least as significant as, if not more significant than, the buildings themselves. From De Architectura to Alberti’s rejoinder De re aedificatoria (On the Art of Building) in the fifteenth century, Palladio’s Quattro Libri (The Four Books of Architecture) of 1570, and Laugier’s Essai sur l'Architecture of 1753. In the 1990s, we treasured the heroic architecture books of the 20th century, from Le Corbusier’s Vers une Architecture to Aldo Rossi’s The Architecture of the City, Rem Koolhaas’s Delirious New York, and of course Robert Venturi’s Learning from Las Vegas, which for me was the very starting point of the postmodern movement.
Abstract:
Architecture in the South Pacific: The Ocean of Islands recounts the recent developments of the South Pacific and its fascinating architecture. This volume traces the European architectural overlay onto this scattered group of islands as well as the transition of these same islands towards a regional identity that has been fashioned by the remoteness of each location, the incomparable setting, and the distinctive ethnic mix of its inhabitants. A series of themed essays present the story of architectural development in the Solomon Islands, Vanuatu, New Caledonia, Fiji, Wallis and Futuna, Tonga, the Cook Islands, Samoa and American Samoa, and French Polynesia. Recent architecture typifies the evolution of the islands as they have been subjected to the transformative waves of alien trade, religion, colonization, war and tourism, followed by post-colonialism and revived nationalism. As with the Pacific region itself, the most prominent characteristic of the architecture is its diversity. The blending of the universal and the local sets the stage for a fresh vision of the South Pacific across a wide range of building types, from spectacular mission churches to sensational resorts in paradise. This book, in full colour, will appeal to architects, armchair-tourists, students and all those for whom the South Pacific is the idyll of their dreams.
Abstract:
With increased consolidation and a few large vendors dominating the market, how can software vendors distinguish themselves in order to maintain profitability and gain market share? Increasingly customers are becoming more proactive in selecting a vendor and a product, drawing upon various publications, market surveys, mailing lists, and, of course, other users. In particular, though, a company's Web site is the obvious place to begin information gathering. In sum, it may seem that the days of the uninformed customer prepared to be "sold to" are potentially all but gone.
Abstract:
The adoption of packaged software is becoming increasingly common in a variety of organizations, and much of the packaged software literature presents this as a straightforward, linear process based on rationalistic evaluation. This paper applies the framework of power relations developed by Markus and Bjørn-Andersen (1987) to a longitudinal study concerning the adoption of a customer relationship management package in a small organization. This is used to highlight both overt and covert power issues within the selection and procurement of the product and to illustrate the interplay of power between senior management, IT managers, IT vendors and consultants, and end-users. The paper contributes to the growing body of literature on packaged software and also to our understanding of how power is deeply embedded within the surrounding processes.
Abstract:
Purpose – This paper seeks to analyse the process of packaged software selection in a small organization, focussing particularly on the role of IT consultants as intermediaries in the process. Design/methodology/approach – This is based upon a longitudinal, qualitative field study concerning the adoption of a customer relationship management package in an SME management consultancy. Findings – The authors illustrate how the process of “salesmanship”, an activity directed by the vendor/consultant and focussed on the interests of senior management, marginalises user needs and ultimately secures the procurement of the software package. Research limitations/implications – Despite the best intentions the authors lose something of the rich detail of the lived experience of technology in presenting the case study as a linear narrative. Specifically, the authors have been unable to do justice to the complexity of the multifarious ways in which individual perceptions of the project were influenced and shaped by the opinions of others. Practical implications – Practitioners, particularly those from within SMEs, should be made aware of the ways in which external parties may have a vested interest in steering projects in a particular direction, which may not necessarily align with their own interests. Originality/value – This study highlights in detail the role of consultants and vendors in software selection processes, an area which has received minimal attention to date. Prior work in this area emphasises the necessary conditions for, and positive outcomes of, appointing external parties in an SME context, with only limited attention being paid to the potential problems such engagements may bring.