647 results for attack models
Abstract:
Sexual segregation is best known in sexually dimorphic ungulates. Many hypotheses have been proposed to explain the evolution of sexual segregation in ungulates, but all are reducible to the influence of two factors: body size and sex-specific reproductive strategy. Definitive tests of these hypotheses are lacking in ungulates because these factors are confounded, all males being somewhat larger than females. Kangaroos represent a parallel radiation of terrestrial herbivores, but their populations are composed of a spectrum of adult body sizes, ranging from small males the same size as females to large males more than twice the size. We exploited this heteromorphism to assess the independent influences of size and sex in these ungulate analogues. We conducted a preliminary study of western grey kangaroos (Macropus fuliginosus) in north-western Victoria, Australia. Adult males predominantly occupied grassland habitat, whereas females occurred mostly in lakebed, woodland and shrubland. Single-sex groups occurred more often than expected during the non-mating season. The diet of large males had the highest proportion of grass, and females had the least. These initial results indicate that both size and sex influence segregation in this species, confirming the worth of kangaroos as marsupial models for research into the evolution of sexual segregation.
Abstract:
Purpose: The increasing number of haematology cancer survivors warrants identification of the most effective model of survivorship care for survivors of a diverse range of haematological cancers treated with aggressive regimens. This review aimed to identify models of survivorship care to support the needs of haematology cancer survivors. Methods: An integrative literature review was conducted using a search of electronic databases (CINAHL, Medline, PsycInfo, PubMed, EMBASE, PsycArticles, Cochrane Library) for eligible articles (up to July 2014). Articles were included if they proposed or reported the use of a model of care for haematology cancer survivors. Results: Fourteen articles were included in this review. Eight articles proposed and described models of care, and six reported the use of a range of survivorship models of care in haematology cancer survivors. No randomised controlled trials or literature reviews were found to have been undertaken specifically with this cohort of cancer survivors. There was variation in the models described and in who provided the survivorship care. Conclusion: Owing to the lack of studies evaluating the effectiveness of models of care, it is difficult to determine the best model of care for haematology cancer survivors. Many different models of care are being put into practice before robust research is conducted. Therefore, well-designed, high-quality pragmatic randomised controlled trials are required to inform clinical practice.
Abstract:
Iterative computational models have been used to investigate the regulation of bone fracture healing by local mechanical conditions. Although their predictions replicate some mechanical responses and histological features, they do not typically reproduce the predominantly radial hard callus growth pattern observed in larger mammals. We hypothesised that this discrepancy results from an artefact of the models’ initial geometry. Using axisymmetric finite element models, we demonstrated that pre-defining a field of soft tissue in which callus may develop introduces high deviatoric strains in the periosteal region adjacent to the fracture. These bone-inhibiting strains are not present when the initial soft tissue is confined to a thin periosteal layer. As observed in previous healing models, tissue differentiation algorithms regulated by deviatoric strain predicted hard callus forming remotely and growing towards the fracture. While dilatational strain regulation allowed early bone formation closer to the fracture, hard callus still formed initially over a broad area, rather than expanding over time. Modelling callus growth from a thin periosteal layer successfully predicted the initiation of hard callus growth close to the fracture site. However, these models were still susceptible to elevated deviatoric strains in the soft tissues at the edge of the hard callus. Our study highlights the importance of the initial soft tissue geometry used for finite element models of fracture healing. If this cannot be defined accurately, alternative mechanisms for the prediction of early callus development should be investigated.
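As an illustration of the kind of rule such iterative models apply, the sketch below (in Python, with placeholder strain thresholds rather than the calibrated values of this study) shows a per-element tissue differentiation step regulated by deviatoric and dilatational strain:

```python
# Hypothetical per-element tissue differentiation rule of the kind used in
# iterative fracture-healing models.  Threshold values are placeholders,
# not the calibrated values of this study.

def differentiate(deviatoric_strain, dilatational_strain, current_tissue):
    """Return the tissue type an element differentiates towards this iteration."""
    if deviatoric_strain > 0.15:
        return "fibrous"      # high distortional strain inhibits bone formation
    if deviatoric_strain > 0.05 or abs(dilatational_strain) > 0.05:
        return "cartilage"    # intermediate strains favour chondrogenesis
    if current_tissue in ("soft", "cartilage"):
        return "bone"         # low strains permit ossification
    return current_tissue

# Example: a periosteal element adjacent to the fracture gap in a model with a
# pre-defined soft-tissue field experiences high deviatoric strain.
print(differentiate(0.20, 0.01, "soft"))   # -> "fibrous" (bone formation inhibited)
```

Under a rule of this shape, the artificially high deviatoric strains introduced by a pre-defined soft-tissue field suppress periosteal bone formation near the fracture, which is the effect the study attributes to the initial geometry.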
Abstract:
Introduction: Hydrogels prepared from star-shaped poly(ethylene glycol) (PEG) and maleimide-functionalized heparin provide a potential matrix for developing three-dimensional (3D) models. We have previously demonstrated that these hydrogels support the cultivation of human umbilical vein endothelial cells (HUVECs). Here we extend this work to create an extracellular matrix (ECM)-like model for studying breast and prostate cancer cell growth in 3D. We also investigate the ability to produce a tri-culture mimicking tumour angiogenesis with cancer spheroids, HUVECs and mesenchymal stem cells (MSCs). Materials and Methods: The breast cancer cell lines MCF-7 and MDA-MB-231, and the prostate cancer cell lines LNCaP and PC3, were seeded into starPEG-heparin hydrogels and grown for 14 days to analyze the effects of varying hydrogel stiffness on spheroid development. The resulting hydrogel constructs were analyzed via proliferation assays, light microscopy and immunostaining. Cancer cell lines were then seeded as spheroids into starPEG-heparin hydrogels functionalized with growth factors, together with HUVECs and MSCs, and grown as a tri-culture. Cultures were analyzed via immunostaining and observed using confocal microscopy. Results: Cultures prepared in MMP-cleavable starPEG-heparin hydrogels displayed spheroid formation, in contrast to adherent growth on tissue culture plastic. Small differences in cancer spheroid growth were visualized between gel stiffnesses across the range of cell lines. Cancer cell lines could be co-cultivated with HUVECs and MSCs, and interaction between tumour spheroids and HUVECs was visualized via confocal microscopy. Further studies will aim to optimize the model to better mimic the ECM environment of in situ tumour angiogenesis. Discussion: Our results confirm the suitability of hydrogels constructed from starPEG-heparin for HUVEC and MSC co-cultivation with cancer cell lines to study cell-cell and cell-matrix interactions in a 3D environment. This represents a step forward in the development of 3D culture models to study the pathomechanisms of breast and prostate cancer.
Abstract:
This project investigated the calcium distributions of the skin, and the growth patterns of skin substitutes grown in the laboratory, using mathematical models. The research found that the calcium distribution in the upper layer of the skin is controlled by three different mechanisms, not one as previously thought. The research also suggests that tight junctions, which are adhesions between neighbouring skin cells, cannot be solely responsible for the differences in the growth patterns of skin substitutes and normal skin.
Abstract:
1. Description of the Work
The Fleet Store was devised as a creative output to establish an exhibition linked to a fashion business model in which emerging designers were encouraged to research new and innovative strategies for creating design-driven and commercial collections for a public consumer. The project was devised to break down the perception among emerging fashion designers that designing commercial collections linked to a sustainable business model is a boring and unnecessary process. The focus was to demystify the business of fashion and to link its importance to a design-driven and public outcome that is more familiar to fashion designers. The criterion for participation was that all designers had to be registered as a business with the Australian Taxation Office. Designers were chosen from the Creative Enterprise Australia Fashion Business Incubator, the QUT fashion graduate alumni and current QUT fashion design and double degree (fashion and business) students with existing businesses. The project evolved from a series of collaborative workshops where designers were introduced to new and innovative creative industries’ business models and the processes, costings and timings involved in creating a niche, sustainable business for a public exhibition of design-driven commercial collections. All designers initiated their own business infrastructure but were then introduced to the concept of collaboration for successful and profitable exhibition and business outcomes. Collaborative strategies such as crowd funding, crowd sourcing, peer-to-peer mentoring and manufacturing were all researched, and strategies for the establishment of the retail exhibition were devised in a collaborative environment. All participants also took on roles outside their ‘designer’ background to create a retail exhibition that was creative but also had critical mass and aesthetic appeal for the consumer. The Fleet Store ‘popped up’ for two weeks (10 days) in a heritage-listed building in an inner-city location. Passers-by were important, but the main consumers were enlisted through interest and investment generated by crowd sourcing, crowd funding, ethical marketing, corporate social responsibility projects, and collaborative public relations and social media strategies. The research has furthered discussion on innovative strategies for emerging fashion designers to initiate and maintain sustainable businesses, and suggests that collaboration combined with a design-driven and business focus can create a sustainable and economically viable retail exhibition.
2. Research Statement
Research Background: The research field involved developing a new ethical, design-driven, collaborative and sustainable model for fashion design practice and management. The research asked whether a public, design-driven, collaborative retail exhibition can create a platform for promoting creative, innovative and sustainable business models for emerging fashion designers. The methodology was primarily practice-led, as all participants were designers in their own right and the project manager acted as a mentor and curator to guide the process and analyse the potential of the research question. The Fleet Store offers new knowledge in design practice and management through the creation of a model in which design outcomes and business models are inextricably linked to the success of the creative output.
Key innovations include extending the commercialisation of emerging fashion businesses by creating a curated retail gallery for collaborative and sustainable strategies to support niche fashion designer labels. This has contributed to a broader conversation on how to nurture and sustain competitive Australian fashion designers/labels.
Research Contribution and Significance: The Fleet Store has contributed to a growing body of research into innovative and sustainable business models for niche fashion and creative industries’ practitioners. All participants have maintained their business infrastructure and many are currently growing their businesses using the strategies tested for The Fleet Store. The exhibition space was visited by over 1,000 people and sales of $27,000 were made in the 10 days of opening. (Follow-up sales of $3,000 have also been reported.) Three of the designers were ‘discovered’ through the exhibition and have received substantial orders from high-profile national buyers and retailers for next-season delivery. Several participants have since collaborated to create other pop-up retail environments and are now mentoring other emerging designers on the significance of a collaborative retail exhibition in consolidating niche business models for emerging fashion designers.
Abstract:
Local spatio-temporal features combined with a bag-of-visual-words model are a popular approach to human action recognition. Bag-of-features methods face several challenges, such as extracting appropriate appearance and motion features from videos, converting the extracted features into a form appropriate for classification, and designing a suitable classification framework. In this paper we address the problem of efficiently representing the extracted features for classification to improve the overall performance. We introduce two generative supervised topic models, maximum entropy discrimination LDA (MedLDA) and class-specific simplex LDA (css-LDA), to encode the raw features into a form suitable for discriminative SVM-based classification. Unsupervised LDA models disconnect topic discovery from the classification task and hence yield poor results compared with the baseline bag-of-words framework. In contrast, supervised LDA techniques learn the topic structure by considering the class labels and improve the recognition accuracy significantly. MedLDA maximizes the likelihood and the within-class margins using max-margin techniques and yields a sparse, highly discriminative topic structure, whereas css-LDA learns separate class-specific topics instead of a common set of topics across the entire dataset. In our representation, topics are learned first and each video is then represented as a topic proportion vector, comparable to a histogram of topics. Finally, SVM classification is performed on the learned topic proportion vectors. We demonstrate the efficiency of these two representation techniques through experiments carried out on two popular datasets. Experimental results show significantly improved performance compared with the baseline bag-of-features framework, which uses k-means to construct histograms of words from the feature vectors.
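A simplified sketch of the representation pipeline described above, using scikit-learn's unsupervised LDA as a stand-in (the supervised MedLDA and css-LDA models used in the paper are not available off the shelf) and synthetic visual-word histograms:

```python
# Simplified sketch: bag-of-visual-words counts -> per-video topic proportion
# vector -> SVM classification.  Uses scikit-learn's unsupervised LDA as a
# stand-in for the paper's supervised MedLDA / css-LDA; data are synthetic.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_videos, vocab_size, n_topics = 200, 500, 30

# Placeholder data: visual-word count histograms and action labels per video.
X_counts = rng.poisson(1.0, size=(n_videos, vocab_size))
y = rng.integers(0, 6, size=n_videos)

lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
theta = lda.fit_transform(X_counts)      # per-video topic proportion vectors

clf = LinearSVC().fit(theta, y)          # SVM on the topic proportions
print(clf.score(theta, y))
```

As noted above, an unsupervised LDA like this decouples topic discovery from classification, so it stands in only for the representation step; MedLDA and css-LDA learn the topics with label information.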
Abstract:
Railway capacity determination and expansion are very important topics. In prior research, however, the competition between different entities, such as train services and train types on different network corridors, has been ignored, poorly modelled, or else assumed to be static. In response, a comprehensive set of multi-objective models has been formulated in this article to perform a trade-off analysis. These models determine the total absolute capacity of railway networks as the most equitable solution according to a clearly defined set of competing objectives. The models also perform a sensitivity analysis of capacity with respect to those competing objectives. The models have been extensively tested on a case study and their significant worth is shown. The models were solved using a variety of techniques; an adaptive ε-constraint method was shown to be superior. In order to identify only the best solution, a simulated annealing meta-heuristic was implemented and tested. However, a linearization technique based upon separable programming was also developed and shown to be superior in terms of solution quality, though far less so in terms of computational time.
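The trade-off analysis rests on ε-constraint scalarisation: optimise one objective while constraining the other to a level ε, then sweep ε to trace the Pareto frontier. A minimal sketch on a toy bi-objective linear programme (not the article's railway capacity model) is:

```python
# Minimal epsilon-constraint sketch for a toy bi-objective LP.
# The model below is illustrative only, not the article's railway formulation:
# maximise f1 = x1 and f2 = x2 subject to a shared resource x1 + x2 <= 10.
import numpy as np
from scipy.optimize import linprog

pareto = []
for eps in np.linspace(0.0, 10.0, 11):
    # maximise f1 = x1 (linprog minimises, so negate) subject to
    # x1 + x2 <= 10 and x2 >= eps (written as -x2 <= -eps), x >= 0.
    res = linprog(c=[-1.0, 0.0],
                  A_ub=[[1.0, 1.0], [0.0, -1.0]],
                  b_ub=[10.0, -eps],
                  bounds=[(0, None), (0, None)])
    pareto.append((res.x[0], res.x[1]))   # (f1, f2) point on the frontier

for f1, f2 in pareto:
    print(f"f1={f1:.1f}, f2={f2:.1f}")
```

Adaptive variants of this method choose the ε grid from the solutions already found rather than sweeping it uniformly as above.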
Abstract:
Traditional sensitivity and elasticity analyses of matrix population models have been used to inform management decisions, but they ignore the economic costs of manipulating vital rates. For example, the growth rate of a population is often most sensitive to changes in adult survival rate, but this does not mean that increasing that rate is the best option for managing the population because it may be much more expensive than other options. To explore how managers should optimize their manipulation of vital rates, we incorporated the cost of changing those rates into matrix population models. We derived analytic expressions for locations in parameter space where managers should shift between management of fecundity and survival, for the balance between fecundity and survival management at those boundaries, and for the allocation of management resources to sustain that optimal balance. For simple matrices, the optimal budget allocation can often be expressed as simple functions of vital rates and the relative costs of changing them. We applied our method to management of the Helmeted Honeyeater (Lichenostomus melanops cassidix; an endangered Australian bird) and the koala (Phascolarctos cinereus) as examples. Our method showed that cost-efficient management of the Helmeted Honeyeater should focus on increasing fecundity via nest protection, whereas optimal koala management should focus on manipulating both fecundity and survival simultaneously. These findings are contrary to the cost-negligent recommendations of elasticity analysis, which would suggest focusing on managing survival in both cases. A further investigation of Helmeted Honeyeater management options, based on an individual-based model incorporating density dependence, spatial structure, and environmental stochasticity, confirmed that fecundity management was the most cost-effective strategy. Our results demonstrate that decisions that ignore economic factors will reduce management efficiency. ©2006 Society for Conservation Biology.
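The cost-aware method builds on the standard eigenvalue sensitivity and elasticity calculation for a projection matrix, sketched below with an invented two-stage matrix (not the Helmeted Honeyeater or koala parameterisation):

```python
# Sketch of the standard sensitivity/elasticity calculation that the
# cost-aware method extends.  The two-stage projection matrix is invented.
import numpy as np

A = np.array([[0.0, 1.2],    # top row: stage-specific fecundities
              [0.3, 0.8]])   # sub-diagonal: maturation; A[1, 1]: adult survival

# Dominant eigenvalue (population growth rate) and its right/left eigenvectors.
vals, W = np.linalg.eig(A)
k = np.argmax(vals.real)
lam = vals.real[k]
w = W[:, k].real                          # stable stage distribution

vals_T, V = np.linalg.eig(A.T)
kT = np.argmax(vals_T.real)
v = V[:, kT].real                         # reproductive values

sensitivity = np.outer(v, w) / (v @ w)    # d(lambda)/d(a_ij)
elasticity = (A / lam) * sensitivity      # proportional contributions to lambda

print("lambda:", round(lam, 3))
print("elasticities:\n", elasticity.round(3))
```

The article's contribution is to weight these quantities by the cost of changing each vital rate; that cost term is omitted from this sketch.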
Abstract:
The quality of environmental decisions should be gauged according to managers' objectives. Management objectives generally seek to maximize quantifiable measures of system benefit, for instance population growth rate. Reaching these goals often requires a certain degree of learning about the system. Learning can occur by using management action in combination with a monitoring system. Furthermore, actions can be chosen strategically to obtain specific kinds of information. Formal decision making tools can choose actions to favor such learning in two ways: implicitly, via the optimization algorithm that is used when there is a management objective (for instance, when using adaptive management), or explicitly, by quantifying knowledge and using it as the fundamental project objective, an approach new to conservation. This paper outlines three conservation project objectives - a pure management objective, a pure learning objective, and an objective that is a weighted mixture of these two. We use eight optimization algorithms to choose actions that meet project objectives and illustrate them in a simulated conservation project. The algorithms provide a taxonomy of decision making tools in conservation management when there is uncertainty surrounding competing models of system function. The algorithms build upon each other such that their differences are highlighted and practitioners may see where their decision making tools can be improved. © 2010 Elsevier Ltd.
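A schematic sketch of how the three objective types could score a single candidate action, with invented model weights and predicted benefits (the paper's eight optimisation algorithms are considerably more involved than this one-step scoring):

```python
# Schematic scoring of one candidate action under a pure management objective,
# a pure learning objective, and a weighted mixture of the two.
# Model weights, predicted benefits and the mixture weight are invented.
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -(p * np.log(p)).sum()

# Two competing models of system function, with current belief weights.
model_weights = np.array([0.6, 0.4])

# Predicted benefit (e.g. population growth rate) of the action under each
# model, and the belief weights expected after monitoring its outcome.
predicted_benefit = np.array([1.10, 0.95])
expected_posterior = np.array([0.85, 0.15])

management_value = model_weights @ predicted_benefit                    # pure management
learning_value = entropy(model_weights) - entropy(expected_posterior)   # pure learning

alpha = 0.7                                                             # mixture weight
mixed_objective = alpha * management_value + (1 - alpha) * learning_value
print(management_value, learning_value, mixed_objective)
```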
Abstract:
The NLM stream cipher, designed by Hoon Jae Lee, Sang Min Sung and Hyeong Rag Kim, is a strengthened version of the LM summation generator that combines linear and non-linear feedback shift registers. In recent works, the NLM cipher has been used for message authentication in lightweight communication over wireless sensor networks and for RFID authentication protocols. This work analyses the security of the NLM stream cipher and the NLM-MAC scheme that is built on top of the NLM cipher. We first show that the NLM cipher suffers from two major weaknesses that lead to key recovery and forgery attacks. We prove that the internal state of the NLM cipher can be recovered with time complexity of about n^(log 7) × 2, where the total length of the internal state is 2n + 2 bits. The attack needs about n^2 key-stream bits. We also show that an adversary is able to forge any MAC tag very efficiently given only one (MAC tag, ciphertext) pair. The proposed attacks are practical and break the scheme with a negligible error probability.
Abstract:
This thesis focused upon the development of improved capacity analysis and capacity planning techniques for railways. A number of innovations were made and were tested on a case study of a real national railway. These techniques can reduce the time required to perform the decision-making activities that planners and managers need to carry out. As all railways need to be expanded to meet increasing demands, the presumption that analytical capacity models can be used to identify how best to improve an existing network at least cost was fully investigated. Track duplication was the mechanism used to expand a network's capacity, and two variant capacity expansion models were formulated. Another outcome of this thesis is the development and validation of bi-objective models for capacity analysis. These models regulate the competition for track access and perform a trade-off analysis. An opportunity to develop more general multi-objective approaches was identified.
Abstract:
Wound healing and tumour growth involve collective cell spreading, which is driven by individual motility and proliferation events within a population of cells. Mathematical models are often used to interpret experimental data and to estimate the parameters so that predictions can be made. Existing methods for parameter estimation typically assume that these parameters are constants and often ignore any uncertainty in the estimated values. We use approximate Bayesian computation (ABC) to estimate the cell diffusivity, D, and the cell proliferation rate, λ, from a discrete model of collective cell spreading, and we quantify the uncertainty associated with these estimates using Bayesian inference. We use a detailed experimental data set describing the collective cell spreading of 3T3 fibroblast cells. The ABC analysis is conducted for different combinations of initial cell densities and experimental times in two separate scenarios: (i) where collective cell spreading is driven by cell motility alone, and (ii) where collective cell spreading is driven by combined cell motility and cell proliferation. We find that D can be estimated precisely, with a small coefficient of variation (CV) of 2–6%. Our results indicate that D appears to depend on the experimental time, which is a feature that has been previously overlooked. Assuming that the values of D are the same in both experimental scenarios, we use the information about D from the first experimental scenario to obtain reasonably precise estimates of λ, with a CV between 4 and 12%. Our estimates of D and λ are consistent with previously reported values; however, our method is based on a straightforward measurement of the position of the leading edge whereas previous approaches have involved expensive cell counting techniques. Additional insights gained using a fully Bayesian approach justify the computational cost, especially since it allows us to accommodate information from different experiments in a principled way.
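A minimal ABC rejection sketch of the estimation idea, with a placeholder leading-edge simulator and illustrative priors rather than the discrete model and data of the study:

```python
# Minimal ABC rejection sketch for estimating cell diffusivity D and
# proliferation rate lambda from a leading-edge summary statistic.
# simulate_leading_edge() is a stand-in for the discrete spreading model;
# the observation, priors and tolerance are illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def simulate_leading_edge(D, lam, t=24.0):
    # Placeholder simulator: Fisher-like front speed 2*sqrt(D*lam) plus noise.
    return 2.0 * np.sqrt(D * lam) * t + rng.normal(0.0, 5.0)

observed_edge = 400.0      # hypothetical observed leading-edge position (um)
tolerance = 20.0

accepted = []
for _ in range(20_000):
    D = rng.uniform(100.0, 2000.0)        # prior on diffusivity (um^2 / h)
    lam = rng.uniform(0.01, 0.10)         # prior on proliferation rate (1 / h)
    if abs(simulate_leading_edge(D, lam) - observed_edge) < tolerance:
        accepted.append((D, lam))

accepted = np.array(accepted)
print("posterior means (D, lambda):", accepted.mean(axis=0))
print("CV of D:", accepted[:, 0].std() / accepted[:, 0].mean())
```

The spread of the accepted samples plays the role of the coefficients of variation reported in the abstract; the study's analysis additionally combines information across experiments with different initial densities and durations.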
Abstract:
PURPOSE: This paper describes dynamic agent composition, used to support the development of flexible and extensible large-scale agent-based models (ABMs). This approach was motivated by a need to extend and modify, with ease, an ABM with an underlying networked structure as more information becomes available. Flexibility was also sought so that simulations can be set up with ease, without the need to program. METHODS: The dynamic agent composition approach consists of having agents, whose implementation has been broken into atomic units, come together at runtime to form the complex system representation on which simulations are run. These components capture information at a fine level of detail and provide a vast range of combinations and options for a modeller to create ABMs. RESULTS: A description of dynamic agent composition is given in this paper, as well as details of its implementation within MODAM (MODular Agent-based Model), a software framework that is applied to the planning of the electricity distribution network. Illustrations of the implementation of dynamic agent composition are given for that domain throughout the paper. It is, however, expected that this approach will be beneficial to other problem domains, especially those with a networked structure, such as water or gas networks. CONCLUSIONS: Dynamic agent composition has many advantages over the way agent-based models are traditionally built, for users, for developers, and for agent-based modelling as a scientific approach. Developers can extend the model without the need to access or modify previously written code; they can develop groups of entities independently and add them to those already defined to extend the model. Users can mix and match already implemented components to form large-scale ABMs, allowing them to quickly set up simulations and easily compare scenarios without the need to program. Dynamic agent composition provides a natural simulation space over which ABMs of networked structures are represented, facilitating their implementation, and verification and validation of models are facilitated by quickly setting up alternative simulations.
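A language-agnostic illustration (here in Python, not MODAM's actual API) of the dynamic agent composition idea: atomic components developed independently are assembled into an agent at runtime:

```python
# Illustration of dynamic agent composition: agents are assembled at runtime
# from independently developed atomic components.  All class and attribute
# names are hypothetical, not MODAM's API.

class Component:
    """Atomic unit of behaviour or data attached to an agent at runtime."""
    def step(self, agent, t):
        pass

class HouseholdLoad(Component):
    def __init__(self, kw): self.kw = kw
    def step(self, agent, t):
        agent.state["load_kw"] = self.kw            # toy demand model

class SolarPanel(Component):
    def __init__(self, kw): self.kw = kw
    def step(self, agent, t):
        agent.state["generation_kw"] = self.kw      # toy generation model

class Agent:
    def __init__(self, name, components):
        self.name, self.components, self.state = name, list(components), {}
    def step(self, t):
        for c in self.components:                   # composed behaviour
            c.step(self, t)

# Mix and match components without modifying previously written classes.
house = Agent("house_01", [HouseholdLoad(1.5), SolarPanel(4.0)])
house.step(t=0)
print(house.state)   # {'load_kw': 1.5, 'generation_kw': 4.0}
```

Adding a new entity type amounts to writing another Component subclass and including it in the list passed to the agent, which mirrors the extend-without-modifying property described in the conclusions.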