901 results for design or documentation process
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-06
Abstract:
Thesis (Master's)--University of Washington, 2016-06
Abstract:
Network building and exchange of information by people within networks is crucial to the innovation process. Contrary to older models, in social networks the flow of information is noncontinuous and nonlinear. There are critical barriers to information flow that operate in a problematic manner. New models and new analytic tools are needed for these systems. This paper introduces the concept of virtual circuits and draws on recent concepts of network modelling and design to introduce a probabilistic switch theory that can be described using matrices. It can be used to model multistep information flow between people within organisational networks, to provide formal definitions of efficient and balanced networks and to describe distortion of information as it passes along human communication channels. The concept of multi-dimensional information space arises naturally from the use of matrices. The theory and the use of serial diagonal matrices have applications to organisational design and to the modelling of other systems. It is hypothesised that opinion leaders or creative individuals are more likely to emerge at information-rich nodes in networks. A mathematical definition of such nodes is developed and it does not invariably correspond with centrality as defined by early work on networks.
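The matrix formulation this abstract describes can be illustrated with a small sketch. The interpretation below (a row-stochastic transfer matrix whose powers give multistep flow, plus a simple "information richness" score per node) is an assumption about how such a model might look, not the paper's actual theory; the network and all probability values are invented for illustration.

```python
import numpy as np

# Hypothetical 4-person network: P[i, j] is the probability that a message
# held by person i is passed to person j in one step. Each row sums to 1,
# so P acts as a probabilistic "switch" routing information between people.
P = np.array([
    [0.0, 0.6, 0.4, 0.0],
    [0.2, 0.0, 0.5, 0.3],
    [0.1, 0.3, 0.0, 0.6],
    [0.0, 0.2, 0.8, 0.0],
])

def multistep(P, k):
    """k-step transfer matrix: entry [i, j] is the probability that a
    message starting at person i sits with person j after k handoffs."""
    return np.linalg.matrix_power(P, k)

def richness(P, K=5):
    """Crude per-node 'information richness': total probability mass
    arriving at each node over the first K steps, from a uniform start.
    Nodes scoring high here need not be the most central by degree."""
    n = P.shape[0]
    start = np.full(n, 1.0 / n)
    score = np.zeros(n)
    Pk = np.eye(n)
    for _ in range(K):
        Pk = Pk @ P          # accumulate one more handoff
        score += start @ Pk  # mass resting at each node after this step
    return score
```

Comparing `richness(P)` against a plain degree count on the same network is one way to see the abstract's point that information-rich nodes need not coincide with classical centrality.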
Abstract:
Various factors can influence the population dynamics of phytophages post-introduction, of which climate is fundamental. Here we present an approach, using a mechanistic modelling package (CLIMEX), that at least enables one to make predictions of likely dynamics based on climate alone. As biological control programs will have minimal funding for basic work (particularly on population dynamics), we show how predictions can be made using a species' geographical distribution, relative abundance across its range, seasonal phenology and laboratory rearing data. Many of these data sets are more likely to be available than long-term population data, and some can be incorporated into the exploratory phase of a biocontrol program. Although models are likely to be more robust the more information is available, useful models can be developed using information on species distribution alone. The fitted model estimates a species' average response to climate, and can be used to predict likely geographical distribution if introduced, where the agent is likely to be more abundant (i.e. good locations) and, more importantly for interpretation of release success, the likely variation in abundance over time due to intra- and inter-year climate variability. The latter will be useful in predicting both the seasonal and long-term impacts of the potential biocontrol agent on the target weed. We believe this tool may not only aid in the agent selection process, but also in the design of release strategies, and in the interpretation of post-introduction dynamics and impacts. More importantly, we are making testable predictions. If biological control is to become more of a science, making and testing such hypotheses will be a key component.
Abstract:
A modern mineral processing plant represents a substantial investment. During the design process, there is often a period when costs (or revenues) must be compensated for by cuts in capital expenditure. In many cases, sampling and measurement equipment provides a soft target for such 'savings'. This process is almost analogous to reducing the capital investment in a corner store by not including a cash register. The consequences will be quite similar - a serious lack of sound performance data and plenty of opportunities for theft - deliberate or inadvertent. This paper makes the case that investment in sampling and measurement equipment is most cost-effective during the design phase. Further, a strong measurement culture will have many benefits, including the ability to take advantage of small gains. In almost any business, there are many more opportunities to make small gains than to make large, step changes. In short, if a project cannot justify the cost of accurate and reliable measurement of its performance, it probably should not be a project at all.
Abstract:
Retrieving large amounts of information over wide area networks, including the Internet, is problematic due to issues arising from latency of response, lack of direct memory access to data serving resources, and fault tolerance. This paper describes a design pattern for solving the issues of handling results from queries that return large amounts of data. Typically these queries would be made by a client process across a wide area network (or Internet), with one or more middle-tiers, to a relational database residing on a remote server. The solution involves implementing a combination of data retrieval strategies, including the use of iterators for traversing data sets and providing an appropriate level of abstraction to the client, double-buffering of data subsets, multi-threaded data retrieval, and query slicing. This design has recently been implemented and incorporated into the framework of a commercial software product developed at Oracle Corporation.
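The combination of strategies this abstract lists (iterators over data subsets, double-buffering, background retrieval, query slicing) can be sketched in a few lines. This is a minimal illustration of the general pattern, not the Oracle implementation; `fetch_slice` is a hypothetical stand-in for the remote query layer.

```python
import threading
import queue

def buffered_rows(fetch_slice, slice_size=1000):
    """Iterate over a large remote result set slice by slice, prefetching
    the next slice on a background thread while the caller consumes the
    current one (double-buffering). The caller sees a plain iterator,
    which hides the slicing and threading behind a simple abstraction.

    fetch_slice(offset, limit) must return a list of rows, and an empty
    list once the data is exhausted (query slicing).
    """
    buf = queue.Queue(maxsize=1)  # at most one slice held in reserve

    def producer():
        offset = 0
        while True:
            rows = fetch_slice(offset, slice_size)  # remote round trip
            buf.put(rows)          # blocks until the consumer catches up
            if not rows:
                return             # empty slice signals end of data
            offset += slice_size

    threading.Thread(target=producer, daemon=True).start()
    while True:
        rows = buf.get()
        if not rows:
            return
        yield from rows            # consumer iterates the current slice
```

Because the queue holds one slice, the network fetch of slice k+1 overlaps with the client's processing of slice k, which is precisely what hides wide-area latency in this pattern.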