272 results for ink reduction software


Relevance:

20.00%

Publisher:

Abstract:

Capacity reduction programmes, in the form of buybacks or decommissioning, have had relatively widespread application in fisheries in the US, Europe and Australia. A common criticism of such programmes is that they remove the least efficient vessels first, resulting in an increase in average efficiency of the remaining fleet, which tends to increase the effective fishing power of the remaining fleet. In this paper, the effects of a buyback programme on average technical efficiency in Australia’s Northern Prawn Fishery are examined using a multi-output production function approach with an explicit inefficiency model. As expected, the results indicate that average efficiency of the remaining vessels was generally greater than that of the removed vessels. Further, there was some evidence of an increase in average scale efficiency in the fleet as the remaining vessels were closer, on average, to the optimal scale. Key factors affecting technical efficiency included company structure and the number of vessels fishing. In regard to fleet size, our model suggests positive externalities associated with more boats fishing at any point in time (due to information sharing and reduced search costs), but also negative externalities due to crowding, with the latter effect dominating the former. Hence, the buyback resulted in a net increase in the individual efficiency of the remaining vessels due to reduced crowding, as well as raising average efficiency through removal of less efficient vessels.
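The abstract does not reproduce the production-function specification, but the idea of scoring vessels against an efficient frontier can be sketched on simulated data. Everything below (variable names, coefficients, sample size) is hypothetical, and corrected OLS (COLS) is used as a simpler stand-in for the explicit inefficiency model the paper estimates.

```python
import numpy as np

# Hypothetical vessel-level data: log catch generated from a
# Cobb-Douglas frontier minus a one-sided inefficiency term.
rng = np.random.default_rng(0)
n = 200
log_effort = rng.normal(5.0, 0.5, n)
log_crew = rng.normal(1.5, 0.3, n)
ineff = rng.exponential(0.2, n)        # one-sided technical inefficiency
noise = rng.normal(0.0, 0.05, n)
log_catch = 1.0 + 0.6 * log_effort + 0.3 * log_crew - ineff + noise

# Corrected OLS: fit an average production function, then shift the
# intercept so the frontier envelops the data from above.
X = np.column_stack([np.ones(n), log_effort, log_crew])
beta, *_ = np.linalg.lstsq(X, log_catch, rcond=None)
resid = log_catch - X @ beta
efficiency = np.exp(resid - resid.max())   # TE scores in (0, 1]

# Removing the least efficient vessels raises the fleet average,
# as in the buyback result described above.
keep = efficiency >= np.quantile(efficiency, 0.25)
print(efficiency.mean(), efficiency[keep].mean())
```

A stochastic frontier with an explicit inefficiency model would decompose the residual into noise and inefficiency components rather than attributing it all to inefficiency as COLS does.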

Abstract:

Social media tools are starting to become mainstream, and those working in the software development industry are often ahead of the game in using current technological innovations to improve their work. With the advent of outsourcing and distributed teams, the software industry is ideally placed to take advantage of social media technologies, tools and environments. This paper looks at how social media is being used by early adopters within the software development industry. Current tools and trends in social media tool use are described and critiqued: what works and what doesn't. We use industrial case studies from platform development, commercial application development and government contexts, which provide a clear picture of the emergent state of the art. These real-world experiences are then used to show how working collaboratively in geographically dispersed teams, enabled by social media, can enhance and improve the development experience.

Abstract:

Purpose Accelerometers are recognized as a valid and objective tool to assess free-living physical activity. Despite the widespread use of accelerometers, there is no standardized way to process and summarize their data, which limits our ability to compare results across studies. This paper a) reviews the decision rules researchers have used in the past, b) compares the impact of using different decision rules on a common data set, and c) identifies issues to consider for accelerometer data reduction. Methods The methods sections of studies published in 2003 and 2004 were reviewed to determine what decision rules previous researchers had used to identify the wearing period, the minimal wear requirement for a valid day, spurious data, and the number of days used to calculate the outcome variables, and to extract bouts of moderate-to-vigorous physical activity (MVPA). For this study, four data reduction algorithms that employ different decision rules were used to analyze the same data set. Results The review showed that much variability was observed among studies that reported their decision rules. Overall, the analyses suggested that using different algorithms impacted several important outcome variables. The most stringent algorithm yielded significantly lower wearing time, the lowest activity counts per minute and counts per day, and fewer minutes of MVPA per day. An exploratory sensitivity analysis revealed that the most stringent inclusion criterion had an impact on sample size and wearing time, which in turn affected many outcome variables. Conclusions These findings suggest that the decision rules employed to process accelerometer data have a significant impact on important outcome variables. Until guidelines are developed, it will remain difficult to compare findings across studies.
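The decision rules discussed above can be made concrete with a small sketch. The thresholds below (60-minute non-wear window, 600 min/day minimal wear, a spurious-count cap, and an MVPA cut-point) are illustrative values assumed for this example, not the rules of any particular study.

```python
import numpy as np

def reduce_day(counts, nonwear_window=60, min_wear_minutes=600,
               spurious_cap=20000, mvpa_cutpoint=1952):
    """Apply illustrative decision rules to one day of minute-level counts."""
    counts = np.asarray(counts, dtype=float)
    counts = np.where(counts > spurious_cap, 0.0, counts)  # spurious-data rule
    # Non-wear rule: any run of >= nonwear_window consecutive zero counts.
    wear = np.ones(len(counts), dtype=bool)
    run = 0
    for i, c in enumerate(counts):
        run = run + 1 if c == 0 else 0
        if run >= nonwear_window:
            wear[i - run + 1:i + 1] = False
    wear_minutes = int(wear.sum())
    return {
        "valid_day": wear_minutes >= min_wear_minutes,  # minimal-wear rule
        "wear_minutes": wear_minutes,
        "mvpa_minutes": int((counts[wear] >= mvpa_cutpoint).sum()),
    }

# Simulated day: 2 h of zeros, 10 h of light activity, 30 min of MVPA,
# then the device is taken off for the rest of the day.
day = [0] * 120 + [500] * 600 + [2500] * 30 + [0] * 690
print(reduce_day(day))
```

Tightening any one of these parameters changes wear time, the valid-day decision, and MVPA minutes, which is exactly the sensitivity the review documents.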

Abstract:

Background Accelerometers have become one of the most common methods of measuring physical activity (PA). Thus, the validity of accelerometer data reduction approaches remains an important research area. Yet few studies directly compare data reduction approaches and other PA measures in free-living samples. Objective To compare PA estimates provided by 3 accelerometer data reduction approaches, steps, and 2 self-reported estimates: Crouter's 2-regression model, Crouter's refined 2-regression model, the weighted cut-point method adopted in the National Health and Nutrition Examination Survey (NHANES; 2003-2004 and 2005-2006 cycles), steps, IPAQ, and 7-day PA recall. Methods A worksite sample (N = 87) completed online surveys and wore ActiGraph GT1M accelerometers and pedometers (SW-200) during waking hours for 7 consecutive days. Daily time spent in sedentary, light, moderate, and vigorous intensity activity and the percentage of participants meeting PA recommendations were calculated and compared. Results Crouter's 2-regression (161.8 +/- 52.3 minutes/day) and refined 2-regression (137.6 +/- 40.3 minutes/day) models provided significantly higher estimates of moderate and vigorous PA and of the proportions of those meeting PA recommendations (91% and 92%, respectively) as compared with the NHANES weighted cut-point method (39.5 +/- 20.2 minutes/day, 18%). Differences between other measures were also significant. Conclusions When comparing 3 accelerometer cut-point methods, steps, and self-report measures, estimates of PA participation vary substantially.
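A cut-point method of the kind compared above ultimately reduces to binning counts per minute into intensity categories. A minimal sketch follows; the thresholds track commonly cited adult values (100 / 2020 / 5999 counts per minute) but should be treated as assumptions for illustration, not the NHANES weighting scheme itself.

```python
# Illustrative intensity thresholds in counts per minute.
CUT_POINTS = (("sedentary", 0), ("light", 100), ("moderate", 2020),
              ("vigorous", 5999))

def classify_minutes(counts):
    """Tally minutes per intensity category by thresholding each minute."""
    totals = {name: 0 for name, _ in CUT_POINTS}
    for c in counts:
        # Test the highest threshold first and stop at the first match.
        for name, threshold in reversed(CUT_POINTS):
            if c >= threshold:
                totals[name] += 1
                break
    return totals

print(classify_minutes([50, 150, 2500, 7000, 30]))
```

Shifting the moderate threshold even a few hundred counts reclassifies many borderline minutes, which is one reason the three methods above disagree so sharply.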

Abstract:

Background A feature of epithelial to mesenchymal transition (EMT) relevant to tumour dissemination is the reorganization of the actin cytoskeleton and focal contacts, influencing cellular ECM adherence and motility. This is coupled with the transcriptional repression of E-cadherin, often mediated by Snail1, Snail2 and Zeb1/δEF1. These genes, overexpressed in breast carcinomas, are known targets of growth factor-initiated pathways; however, it is less clear how alterations in ECM attachment cross-modulate to regulate these pathways. EGF induces EMT in the breast cancer cell line PMC42-LA, and the kinase inhibitor staurosporine (ST) induces EMT in embryonic neural epithelial cells, with F-actin de-bundling and disruption of cell-cell adhesion, via inhibition of aPKC. Methods PMC42-LA cells were treated for 72 h with 10 ng/ml EGF, 40 nM ST, or both, and assessed for expression of E-cadherin repressor genes (Snail1, Snail2, Zeb1/δEF1) and EMT-related genes by QRT-PCR, multiplex tandem PCR (MT-PCR) and immunofluorescence +/- cycloheximide. Actin and focal contacts (paxillin) were visualized by confocal microscopy. A public database of human breast cancers was assessed for expression of Snail1 and Snail2 in relation to outcome. Results When PMC42-LA cells were treated with EGF, Snail2 was the principal E-cadherin repressor induced. With ST or ST+EGF this shifted to Snail1, with more extreme EMT and Zeb1/δEF1 induction seen with ST+EGF. ST reduced stress fibres and focal contact size rapidly and independently of gene transcription. Gene expression analysis by MT-PCR indicated that ST repressed many genes which were induced by EGF (EGFR, CAV1, CTGF, CYR61, CD44, S100A4) and induced genes which alter the actin cytoskeleton (NLF1, NLF2, EPHB4). Examination of the public database of breast cancers revealed that tumours exhibiting higher Snail1 expression have an increased risk of disease recurrence.
This was not seen for Snail2, and Zeb1/δEF1 showed a reverse correlation with lower expression values being predictive of increased risk. Conclusion ST in combination with EGF directed a greater EMT via actin depolymerisation and focal contact size reduction, resulting in a loosening of cell-ECM attachment along with Snail1-Zeb1/δEF1 induction. This appeared fundamentally different to the EGF-induced EMT, highlighting the multiple pathways which can regulate EMT. Our findings add support for a functional role for Snail1 in invasive breast cancer.

Abstract:

QUT Software Finder (https://researchdatafinder.qut.edu.au/scf) is a searchable repository of metadata describing software and source code created as a result of QUT research activities. It was launched in December 2013. The registry was designed to aid the discovery and visibility of QUT research outputs and to encourage sharing and re-use of code and software throughout the research community, both nationally and internationally. The repository platform used is VIVO, an open-source product initially developed at Cornell University. QUT Software Finder records that describe software or code are connected to information about the researchers involved, their research groups, related publications and related projects. Links to where the software or code can be accessed are also provided, alongside licensing and re-use information.

Abstract:

The cotton strip assay (CSA) is an established technique for measuring soil microbial activity. The technique involves burying cotton strips and measuring their tensile strength after a certain time. This gives a measure of the rotting rate, R, of the cotton strips, which in turn is a measure of soil microbial activity. This paper examines properties of the technique and indicates how the assay can be optimised. Humidity conditioning of the cotton strips before measuring their tensile strength reduced the within- and between-day variance and enabled the distribution of the tensile strength measurements to approximate normality. The test data came from a three-way factorial experiment (two soils, two temperatures, three moisture levels). The cotton strips were buried in the soil for intervals of time ranging up to 6 weeks. This enabled the rate of loss of cotton tensile strength with time to be studied under a range of conditions. An inverse cubic model accounted for greater than 90% of the total variation within each treatment combination. This offers support for summarising the decomposition process by a single parameter R. The approximate variance of the decomposition rate was estimated from a function incorporating the variance of tensile strength and the derivative of the decomposition rate, R, with respect to tensile strength. This variance function has a minimum when the measured strength is approximately 2/3 of the original strength. The estimates of R are almost unbiased and relatively robust against the cotton strips being left in the soil for more or less than the optimal time. We conclude that the rotting rate R should be measured using the inverse cubic equation, and that the cotton strips should be left in the soil until their strength has been reduced to about 2/3 of its original value.
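The 2/3 optimum can be reproduced numerically under one common parameterisation of inverse cubic decay, S(t) = S0 / (1 + (R t)^3); this functional form is an assumption here, since the abstract does not state the equation. Propagating the tensile-strength variance through R(S) by the delta method and minimising locates the optimum:

```python
import numpy as np

S0, t = 100.0, 4.0   # original tensile strength and burial time (arbitrary units)

def rate(S):
    """Rotting rate R recovered from measured strength S, assuming the
    inverse cubic form S = S0 / (1 + (R * t) ** 3)."""
    return (S0 / S - 1.0) ** (1.0 / 3.0) / t

# Delta method: Var(R) is proportional to (dR/dS)^2 for constant Var(S),
# so the best measurement point minimises the squared derivative.
S = np.linspace(1.0, 99.0, 9801)
dR_dS = np.gradient(rate(S), S)
S_opt = S[np.argmin(dR_dS ** 2)]
print(S_opt / S0)   # close to 2/3
```

Under this parameterisation the minimum falls at exactly S = 2S0/3, matching the recommendation to retrieve the strips once strength has dropped to about two-thirds of its original value.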

Abstract:

Computational optimisation of clinically important electrocardiogram (ECG) signal features, within a single heart beat, is undertaken using a Markov-chain Monte Carlo (MCMC) method. A detailed, efficient, data-driven software implementation of an MCMC algorithm is presented. Software parallelisation is explored first, and it is shown that parallelisation is possible despite the large degree of inter-dependency among model parameters. An initial reconfigurable hardware approach is also explored for future applicability to real-time computation on a portable ECG device under continuous extended use.
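The abstract gives no model details, so as a generic illustration of the machinery such an implementation builds on, here is a minimal random-walk Metropolis sampler targeting a hypothetical standard-normal posterior over a single feature parameter (the target, step size and seed are all invented for the example):

```python
import math
import random

def metropolis(log_post, x0, n_steps=20000, step=0.5, seed=1):
    """Minimal random-walk Metropolis sampler over one parameter."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)
        lp_prop = log_post(proposal)
        # Accept with probability min(1, posterior ratio).
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = proposal, lp_prop
        samples.append(x)
    return samples

# Hypothetical target: a standard-normal log-posterior over one ECG feature.
samples = metropolis(lambda x: -0.5 * x * x, x0=3.0)
burned = samples[5000:]          # discard burn-in
mean = sum(burned) / len(burned)
print(round(mean, 2))
```

Parallelising such a chain is non-trivial precisely because each step depends on the previous state, which is the inter-dependency challenge the abstract alludes to; common workarounds run multiple chains or parallelise the likelihood evaluation itself.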

Abstract:

Engineering design processes are necessary to attain the requisite standards of integrity for high-assurance safety-related systems. Additionally, human factors design initiatives can provide critical insights that parameterise their development. Unfortunately, the popular perception of human factors as a "forced marriage" between engineering and psychology often provokes views where the 'human factor' is perceived as a threat to systems design. Some popular performance-based standards for developing safety-related systems advocate identifying and managing human factors throughout the system lifecycle. However, they also have a tendency to fall short in their guidance on the application of human factors methods and tools, let alone how the outputs generated can be integrated into various stages of the design process. This case study describes a project that converged engineering with human factors to develop a safety argument for new low-cost railway level crossing technology for system-wide implementation in Australia. The paper brings together the perspectives of a software engineer and a cognitive psychologist and their involvement in the project over two years of collaborative work to develop a safety argument for low-cost level crossing technology. Safety and reliability requirements were informed by applying human factors analytical tools that supported the evaluation and quantification of human reliability where users interfaced with the technology. The project team was confronted with significant challenges in cross-disciplinary engagement, particularly the complexities of dealing with incongruences in disciplinary language. They were also encouraged to think 'outside the box' about how users of a system interpreted system states and behaviour. Importantly, some of these states, while considered safe within the boundary of the constituent systems that implemented safety-related functions, could actually lead users to engage in deviant behaviour. 
Psychology explained how user compliance could be eroded to levels that effectively undermined the risk reduction afforded by the systems. By linking the engineering and psychology disciplines, overall safety performance was improved by introducing technical requirements and making design decisions that minimised the system states and behaviours that led to user deviancy. As a commentary on the utility of transdisciplinary collaboration for technical specification, the processes used to bridge the two disciplines are conceptualised in a graphical model.

Abstract:

The watermarking model targets programs written in an object-oriented language (such as C++ and Java). It allows watermarks to be inserted at three "orthogonal" levels. At the first level, watermarks are injected into objects. The second level of watermarking is used to select the proper variants of the source code. The third level uses a transition function that can be used to generate copies with different functionalities. Generic watermarking schemes are presented and their security is discussed.
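The paper's actual schemes are not reproduced here, but the second-level idea, encoding watermark bits by choosing among semantically equivalent source variants, can be illustrated with a toy sketch (the variant table below is invented for the example):

```python
# Toy "level 2" watermark: each bit selects one of two semantically
# equivalent source-code variants, so the watermark survives in the
# program text without changing its behaviour.
VARIANTS = [("i += 1", "i = i + 1"),
            ("x *= 2", "x = x * 2"),
            ("if not done:", "if done == False:")]

def embed(bits):
    """Render source lines so that line j carries watermark bit j."""
    return [pair[bit] for pair, bit in zip(VARIANTS, bits)]

def extract(lines):
    """Recover the watermark by identifying which variant was used."""
    return [pair.index(line) for pair, line in zip(VARIANTS, lines)]

print(extract(embed([1, 0, 1])))
```

A real scheme must also resist an adversary who normalises the source (e.g. reformats or re-expresses statements), which is where the security analysis mentioned above comes in.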

Abstract:

This article elucidates and analyzes the fundamental underlying structure of the renormalization group (RG) approach as it applies to the solution of any differential equation involving multiple scales. The amplitude equation derived through the elimination of secular terms arising from a naive perturbation expansion of the solution to these equations by the RG approach is reduced to an algebraic equation which is expressed in terms of the Thiele semi-invariants or cumulants of the eliminant sequence {Z_i}, i = 1, 2, …. Its use is illustrated through the solution of both linear and nonlinear perturbation problems and certain results from the literature are recovered as special cases. The fundamental structure that emerges from the application of the RG approach is not the amplitude equation but the aforementioned algebraic equation. © 2008 The American Physical Society.
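The secular-term mechanism the article analyses can be seen in a textbook example (standard material, not drawn from the article itself): for a weakly damped oscillator, the naive expansion develops a term growing linearly in t, and the RG amplitude equation resums it into a decaying amplitude.

```latex
% Textbook illustration: RG removal of a secular term.
\begin{align}
  &\ddot{y} + 2\epsilon\dot{y} + y = 0, \qquad \epsilon \ll 1,\\
  &y \simeq A\cos(t+\phi) - \epsilon A\, t\cos(t+\phi) + O(\epsilon^2)
    \quad \text{(naive expansion; secular in } t\text{)},\\
  &\frac{dA}{d\tau} = -\epsilon A
    \;\Rightarrow\; A(\tau) = A_0 e^{-\epsilon\tau}
    \quad \text{(RG amplitude equation resumming the secular term)}.
\end{align}
```

Substituting the resummed amplitude back reproduces the exact decaying envelope to leading order, which is the behaviour a truncated naive expansion fails to capture at times t of order 1/ε.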

Abstract:

Sub-oxide-to-metallic, highly-crystalline nanowires with uniformly distributed nanopores in the 3 nm range have been synthesized by a unique combination of plasma oxidation, re-deposition and electron-beam reduction. The electron-beam-exposure-controlled oxide → sub-oxide → metal transition is explained using a non-equilibrium model.