898 results for Sophistic thought
Abstract:
Purpose: Performance heterogeneity between collaborative infrastructure projects is typically examined by considering procurement systems and their governance mechanisms at static points in time. The literature neglects to consider the impact of dynamic learning capability, which is thought to reconfigure governance mechanisms over time in response to evolving market conditions. This conceptual paper proposes a new model to show how continuous joint learning of participant organisations improves project performance.
Design/methodology/approach: There are two stages of conceptual development. In the first stage, the management literature is analysed to explain the Standard Model of dynamic learning capability, which emphasises three learning phases for organisations. This Standard Model is extended to derive a novel Circular Model of dynamic learning capability that adds a feedback loop between performance and learning. In the second stage, the construction management literature is consulted, adding project lifecycle, stakeholder diversity and three organisational levels to the analysis, to arrive at the Collaborative Model of dynamic learning capability.
Findings: The Collaborative Model should enable construction organisations to adapt and perform successfully under changing market conditions. The complexity of learning cycles results in capabilities that are imperfectly imitable between organisations, explaining performance heterogeneity on projects.
Originality/value: The Collaborative Model provides a theoretically substantiated description of project performance, driven by the evolution of procurement systems and governance mechanisms. The Model's empirical value will be tested in future research.
Abstract:
Introduction: Different types of hallucinations are symptomatic of different conditions. Schizotypal hallucinations are unique in that they follow existing delusional narrative patterns: they are often bizarre, they are generally multimodal, and they are particularly vivid (the experience of a newsreader abusing you personally over the TV is both visual and aural; patients who feel and hear silicone chips under their skin suffer from haptic as well as aural hallucinations; and so on). Although there are a number of hypotheses for hallucinations, few cogently grapple with the sheer bizarreness of those experienced in schizotypal psychosis.
Methods: A review-based hypothesis, traversing theory from the molecular level to phenomenological expression as a distinct and recognizable symptomatology.
Conclusion: Hallucinations appear to be caused by a two-fold dysfunction in the mesofrontal dopamine pathway, which is considered here to mediate attention of different types: in the anterior medial frontal lobe, the receptors (largely D1 type) mediate declarative awareness, whereas the receptors in the striatum (largely D2 type) mediate latent awareness of known schemata. In healthy perception, most of the perceptual load is carried by the latter: by the top-down predictive and mimetic engine, with the bottom-up mechanism serving as a secondary tool that brings conscious deliberation to stimuli that fail to match expectations. In schizophrenia, the predictive mode is over-stimulated, while the bottom-up feedback mechanism atrophies. This dysfunctional distribution pattern effectively confines dopamine activity to the striatum, thereby stimulating the structural components of thought and behaviour: well-learned routines, narrative structures, lexica, grammar, schemata, archetypes, and other procedural resources. Meanwhile, the loss of activity in the frontal complex reduces the capacity for declarative awareness and for processing anything that fails to meet expectations.
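The trade-off described in the Conclusion lends itself to a toy numerical illustration. The sketch below is not from the paper: the linear precision-weighting scheme, the parameter values, and the function names are illustrative assumptions. It shows how an over-weighted top-down prior combined with an atrophied bottom-up error signal pulls percepts toward the learned schema regardless of the input.

```python
def perceive(stimulus, prior_mean, prior_weight, bottom_up_weight):
    """Blend a top-down prediction with bottom-up evidence.

    The percept is the learned expectation (prior_mean) corrected by
    the prediction error, scaled by the relative weight given to the
    bottom-up channel.
    """
    prediction_error = stimulus - prior_mean
    gain = bottom_up_weight / (bottom_up_weight + prior_weight)
    return prior_mean + gain * prediction_error

stimulus = 5.0     # an unexpected sensory input
expectation = 0.0  # the schema-driven prediction

# Healthy balance: bottom-up error signalling corrects the prediction.
print(perceive(stimulus, expectation, prior_weight=1.0, bottom_up_weight=4.0))  # ~4.0

# Over-stimulated predictive mode with an atrophied bottom-up pathway:
# the percept collapses toward the learned schema despite the input.
print(perceive(stimulus, expectation, prior_weight=4.0, bottom_up_weight=0.5))  # ~0.56
```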
Abstract:
This paper combines experimental data with simple mathematical models to investigate the influence of spray formulation type and leaf character (wettability) on the shatter, bounce and adhesion of droplets impacting on cotton, rice and wheat leaves. Impaction criteria that allow for different angles of the leaf surface and the droplet impact trajectory are presented; their predictions are based on whether combinations of droplet size and velocity lie above or below bounce and shatter boundaries. In the experimental component, real leaves are used, with all their inherent natural variability. Further, commercial agricultural spray nozzles are employed, resulting in a range of droplet characteristics. Given this natural variability, there is broad agreement between the data and predictions. As predicted, the shatter of droplets was found to increase as droplet size and velocity increased and as the surface became harder to wet. Bouncing of droplets occurred most frequently on hard-to-wet surfaces with high surface tension mixtures. On the other hand, a number of small droplets with low impact velocity were observed to bounce when predicted to lie well within the adhering regime. We believe this discrepancy between the predictions and experimental data could be due to air layer effects that were not taken into account in the current bounce equations. Other discrepancies between experiment and theory are thought to be due to the current assumption of a dry impact surface, whereas, in practice, the leaf surfaces became increasingly covered with fluid throughout the spray test runs.
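The impaction criteria can be pictured as threshold curves in droplet size-velocity space. The following sketch is a minimal stand-in, assuming simple Weber-number thresholds rather than the paper's fitted bounce and shatter boundaries; the threshold values, density constant, and function names are all illustrative.

```python
RHO = 1000.0  # droplet density (kg/m^3); a water-like spray is assumed

def weber_number(diameter_m, velocity_ms, surface_tension):
    """Weber number: ratio of inertial to surface-tension forces at
    impact. surface_tension is in N/m."""
    return RHO * velocity_ms**2 * diameter_m / surface_tension

def impact_outcome(diameter_m, velocity_ms, surface_tension,
                   bounce_threshold, shatter_threshold):
    """Classify an impact as adhesion, bounce, or shatter.

    The two thresholds stand in for the paper's bounce/shatter
    boundary curves, which depend on leaf wettability and the
    angle between leaf surface and impact trajectory.
    """
    we = weber_number(diameter_m, velocity_ms, surface_tension)
    if we > shatter_threshold:
        return "shatter"
    if we > bounce_threshold:
        return "bounce"
    return "adhere"

# A 300-micron droplet at 5 m/s with a high-surface-tension mixture
# on a hard-to-wet leaf (illustrative thresholds only).
print(impact_outcome(300e-6, 5.0, 0.072,
                     bounce_threshold=50.0, shatter_threshold=600.0))
```

Larger, faster droplets push the Weber number past the shatter threshold, matching the trend reported in the abstract.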
Abstract:
Five significant problems hinder advances in understanding of the volcanology of kimberlites: (1) kimberlite geology is very model-driven; (2) a highly genetic terminology drives deposit or facies interpretation; (3) the effects of alteration on preserved depositional textures have been grossly underestimated; (4) the level of understanding of the physical process significance of preserved textures is limited; and (5) some inferred processes and deposits are not based on actual, modern volcanological processes. These issues need to be addressed in order to advance understanding of kimberlite volcanological pipe-forming processes and deposits. The traditional, steep-sided southern African pipe model (Class I) consists of a steep tapering pipe with a deep root zone, a middle diatreme zone and an upper crater zone (if preserved). Each zone is thought to be dominated by distinctive facies, respectively: hypabyssal kimberlite (HK, descriptively called here massive coherent porphyritic kimberlite), tuffisitic kimberlite breccia (TKB, descriptively called here massive, poorly sorted lapilli tuff) and crater zone facies, which include variably bedded pyroclastic kimberlite and resedimented and reworked volcaniclastic kimberlite (RVK). Porphyritic coherent kimberlite may, however, also be emplaced at different levels in the pipe, as later-stage intrusions, as well as dykes in the surrounding country rock. The relationship between HK and TKB is not always clear. Subterranean fluidisation as an emplacement process is a largely unsubstantiated hypothesis; modern in-vent volcanological processes should initially be considered to explain observed deposits. Crater zone volcaniclastic deposits can occur within the diatreme zone of some pipes, indicating that the pipe was largely empty at the end of the eruption and subsequently began to fill in, largely through resedimentation and the sourcing of pyroclastic deposits from nearby vents. The Class II and III Canadian kimberlite models have a more factual, descriptive basis, but are still inadequately documented given the recency of their discovery. The diversity amongst kimberlite bodies suggests that a three-model classification is an over-simplification. Every kimberlite is altered to varying degrees, which is an intrinsic consequence of the ultrabasic composition of kimberlite and the in-vent context; few preserve original textures. The effects of syn- to post-emplacement alteration on original textures have not been adequately considered to date, and should be back-stripped to identify original textural elements and configurations. Applying sedimentological textural configurations as a guide to emplacement processes would be useful. The traditional terminology carries many connotations about spatial position in the pipe and about process. Perhaps the traditional terminology can be retained in the industrial setting as a general lithofacies-mining terminological scheme because it is so entrenched. However, for research purposes a more descriptive lithofacies terminology should be adopted to facilitate detailed understanding of deposit characteristics, important variations in these, and their process origins. For example, every deposit of TKB differs in componentry, texture, or depositional structure. However, because so many deposits in many different pipes are called TKB, there is an implication that they are all similar and that similar processes were involved, which is far from clear.
Abstract:
The rise of the peer economy poses complex new regulatory challenges for policy-makers. The peer economy, typified by services like Uber and AirBnB, promises substantial productivity gains through the more efficient use of existing resources and a marked reduction in regulatory overheads. These services are rapidly disrupting established markets, but the regulatory trade-offs they present are difficult to evaluate. In this paper, we examine the peer economy in the context of ride-sharing and the ongoing struggle over regulatory legitimacy between the taxi industry and new entrants Uber and Lyft. We first sketch the outlines of ride-sharing as a complex regulatory problem, showing how questions of efficiency are necessarily bound up in questions about levels of service, controls over pricing, and different approaches to setting, upholding, and enforcing standards. We outline the need for data-driven policy to understand the way that algorithmic systems work and what effects these might have in the medium to long term on measures of service quality, safety, labour relations, and equality. Finally, we discuss how the competition for legitimacy is not primarily being fought on utilitarian grounds, but is instead carried out within the context of a heated ideological battle between different conceptions of the role of the state and private firms as regulators. We ultimately argue that the key to understanding these regulatory challenges is to develop better conceptual models of the governance of complex systems by private actors and of the methods available to the state for influencing their actions. These struggles are not, as is often thought, struggles between regulated and unregulated systems; rather, they turn on the important regulatory work carried out by powerful, centralised private firms – both the incumbents of existing markets and the disruptive network operators in the peer economy.
Abstract:
The images from The Ripple Effect appear to be advertising images but carry a deeper social message. They are deliberately confronting, humorous, and thought-provoking, designed to create debate about true-life experiences of hospital treatment, recovery and the support available in our community. The works in this exhibition carry the hopes and aspirations of a community that is bonded together by its collective experiences and shares a vision of the resources needed for a productive and healthy recovery.
Abstract:
Copyright was once one of the more obscure areas of law. It applied primarily to resolve disputes between rival publishers, and there was a time, not too long ago, when ordinary people gave it no thought. Copyright disputes were like subatomic particles: everyone knew that they existed, but nobody had ever seen one. In the digital age, however, copyright has become a heated, passionate, bloody battleground. The 'copyright wars' now pitch readers against authors, pirates against publishers, and content owners against communications providers. Everyone has heard a movie producer decry the rampant infringement of streaming sites, or a music executive suggest that BitTorrent is the end of civilisation as we know it. But everyone infringes copyright on an almost constant basis - streaming amateur videos with a soundtrack that isn't quite licensed, filesharing mp3s, copying LOLcat pictures from Facebook, posting pictures on Pinterest without permission, and so on - and most know full well they're in breach of the law.
Abstract:
This special issue explores the nuances of graduate creative work, the kinds of value that creative graduates add through work of various types, graduate employability issues for creative graduates, emerging and developing creative career identities, and the implications for educators who are tasked with developing a capable creative workforce. The extant literature tends to characterise creative careers either as 'precarious' and insecure or as the engine room of the creative economy. In actuality, however, the creative workforce is far more heterogeneous than either of these positions suggests, and creative careers are far more complex and diverse than previously thought. The task of creative educators is likewise much more challenging than previously supposed.
Abstract:
It’s the stuff of nightmares: your intimate images are leaked and posted online by somebody you thought you could trust. But in Australia, victims often have no real legal remedy for this kind of abuse. This is the key problem of regulating the internet. Often, speech we might consider abusive or offensive isn’t actually illegal. And even when the law technically prohibits something, enforcing it directly against offenders can be difficult. It is a slow and expensive process, and where the offender or the content is overseas, there is virtually nothing victims can do. Ultimately, punishing intermediaries for content posted by third parties isn’t helpful. But we do need to have a meaningful conversation about how we want our shared online spaces to feel. The providers of these spaces have a moral, if not legal, obligation to facilitate this conversation.
Abstract:
The work of French sociologist, anthropologist and philosopher Pierre Bourdieu has been influential across a set of cognate disciplines that can be classified as physical culture studies. Concepts such as field, capital, habitus and symbolic violence have been used as theoretical tools by scholars and students looking to understand the nature and purpose of sport, leisure, physical education and human movement within wider society. Pierre Bourdieu and Physical Culture is the first book to focus on the significance of Bourdieu’s work for, and in, physical culture. Bringing together the work of leading and emerging international researchers, it introduces the core concepts in Bourdieu’s thought and work, and presents a series of fascinating demonstrations of the application of his theory to physical culture studies. A concluding section discusses the inherent difficulties of choosing and using theory to understand the world around us. By providing an in-depth and multi-layered example of how theory can be used across the many and varied components of sport, leisure, physical education and human movement, this book should help all serious students and researchers in physical culture to better understand the importance of social theory in their work.
Abstract:
Cortical connectivity is associated with cognitive and behavioral traits that are thought to vary between sexes. Using high-angular resolution diffusion imaging at 4 Tesla, we scanned 234 young adult twins and siblings (mean age: 23.4 ± 2.0 SD years) with 94 diffusion-encoding directions. We applied a novel Hough transform method to extract fiber tracts throughout the entire brain, based on fields of constant solid angle orientation distribution functions (ODFs). Cortical surfaces were generated from each subject's 3D T1-weighted structural MRI scan, and tracts were aligned to the anatomy. Network analysis revealed the proportions of fibers interconnecting 5 key subregions of the frontal cortex, including connections between hemispheres. We found significant sex differences (147 women/87 men) in the proportions of fibers connecting contralateral superior frontal cortices. Interhemispheric connectivity was greater in women, in line with long-standing theories of hemispheric specialization. These findings may be relevant for ongoing studies of the human connectome.
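A minimal sketch of the network-analysis step, under stated assumptions: fiber counts are arranged in a region-by-region matrix, interhemispheric connectivity is taken as the proportion of fibers linking left and right regions, and a plain two-sample t-test stands in for the study's statistics (which would also need to account for twin relatedness). All data and names below are simulated and hypothetical.

```python
import numpy as np
from scipy import stats

def interhemispheric_proportion(fiber_counts, left_idx, right_idx):
    """Proportion of a subject's extracted fibers that cross
    hemispheres, given a square region-by-region count matrix and
    the indices of left- and right-hemisphere regions."""
    total = fiber_counts.sum()
    crossing = (fiber_counts[np.ix_(left_idx, right_idx)].sum()
                + fiber_counts[np.ix_(right_idx, left_idx)].sum())
    return crossing / total

rng = np.random.default_rng(0)
# Simulated 10-region connectomes (5 left, 5 right regions) per group.
women = [rng.integers(1, 100, (10, 10)) for _ in range(20)]
men = [rng.integers(1, 100, (10, 10)) for _ in range(20)]
left, right = range(5), range(5, 10)

w_prop = [interhemispheric_proportion(x, left, right) for x in women]
m_prop = [interhemispheric_proportion(x, left, right) for x in men]
print(stats.ttest_ind(w_prop, m_prop))  # group comparison of proportions
```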
Abstract:
Meta-analyses estimate a statistical effect size for a test or an analysis by combining results from multiple studies without necessarily having access to each individual study's raw data. Multi-site meta-analysis is crucial for imaging genetics, as single sites rarely have a sample size large enough to pick up effects of single genetic variants associated with brain measures. However, if raw data can be shared, combining data in a "mega-analysis" is thought to improve power and precision in estimating global effects. As part of an ENIGMA-DTI investigation, we use fractional anisotropy (FA) maps from 5 studies (total N=2,203 subjects, aged 9-85) to estimate heritability. We combine the studies through meta- and mega-analyses as well as a mixture of the two - combining some cohorts with mega-analysis and meta-analyzing the results with those of the remaining sites. A combination of mega- and meta-approaches may boost power compared to meta-analysis alone.
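The meta-analytic arm can be illustrated with standard fixed-effect, inverse-variance pooling; the abstract does not specify the exact weighting used, so this sketch is an assumption, and the per-site heritability values are invented.

```python
import numpy as np

def inverse_variance_meta(estimates, std_errors):
    """Fixed-effect meta-analysis: pool per-site estimates with
    inverse-variance weights; returns the pooled estimate and its
    standard error."""
    est = np.asarray(estimates)
    w = 1.0 / np.asarray(std_errors) ** 2
    pooled = np.sum(w * est) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled_se

# Hypothetical per-site heritability estimates for an FA measure.
h2 = [0.55, 0.62, 0.48, 0.70, 0.58]
se = [0.08, 0.10, 0.12, 0.09, 0.07]
print(inverse_variance_meta(h2, se))
```

The mixed approach described in the abstract would pool some cohorts' raw data first, then meta-analyze that mega-analytic estimate alongside the remaining sites' estimates with the same weighting.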
Abstract:
Control of iron homeostasis is essential for healthy central nervous system function: iron deficiency is associated with cognitive impairment, yet iron overload is thought to promote neurodegenerative diseases. Specific genetic markers have been previously identified that influence levels of transferrin, the protein that transports iron throughout the body, in the blood and brain. Here, we discovered that transferrin levels are related to detectable differences in the macro- and microstructure of the living brain. We collected brain MRI scans from 615 healthy young adult twins and siblings, of whom 574 were also scanned with diffusion tensor imaging at 4 Tesla. Fiber integrity was assessed using the diffusion tensor imaging-based measure of fractional anisotropy. In bivariate genetic models based on monozygotic and dizygotic twins, we discovered that partially overlapping additive genetic factors influenced transferrin levels and brain microstructure. We also examined common variants in genes associated with transferrin levels, TF and HFE, and found that a commonly carried polymorphism (H63D at rs1799945) in the hemochromatosis-associated HFE gene was associated with white matter fiber integrity. This gene has a well-documented association with iron overload. Our statistical maps reveal previously unknown influences of the same gene on brain microstructure and transferrin levels. This discovery may shed light on the neural mechanisms by which iron affects cognition, neurodevelopment, and neurodegeneration.
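As an illustration of the single-variant association step, the sketch below regresses a simulated FA measure on minor-allele count with age and sex as covariates. This is a simplification: the study's twin design calls for models that handle family relatedness, and every value and variable name here is hypothetical.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative additive test of one variant (e.g., HFE H63D,
# rs1799945) against a white-matter integrity measure; simulated data.
rng = np.random.default_rng(1)
n = 574
genotype = rng.integers(0, 3, n)       # minor-allele count: 0, 1, or 2
age = rng.normal(23, 2, n)
sex = rng.integers(0, 2, n)
fa = 0.45 + 0.01 * genotype + rng.normal(0, 0.03, n)  # mean FA per subject

# Regress FA on genotype, adjusting for age and sex.
X = sm.add_constant(np.column_stack([genotype, age, sex]))
model = sm.OLS(fa, X).fit()
print(model.params[1], model.pvalues[1])  # additive genotype effect on FA
```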
Abstract:
Several genetic variants are thought to influence white matter (WM) integrity, measured with diffusion tensor imaging (DTI). Voxel-based methods can test genetic associations, but heavy multiple-comparisons corrections are required to adjust for searching the whole brain and for all genetic variants analyzed. Thus, genetic associations are hard to detect even in large studies. Using a recently developed multi-SNP analysis, we examined the joint predictive power of a group of 18 cholesterol-related single nucleotide polymorphisms (SNPs) on WM integrity, measured by fractional anisotropy. To boost power, we limited the analysis to brain voxels that showed significant associations with total serum cholesterol levels. From this space, we identified two genes whose effects replicated in individual voxel-wise analyses of the whole brain. Multivariate analyses of genetic variants on a reduced anatomical search space may help to identify the SNPs with the strongest effects on the brain from a broad panel of genes.
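A rough sketch of the two-stage idea follows, with heavy assumptions: the voxel-screening threshold is arbitrary, a plain joint least-squares fit stands in for the paper's multi-SNP method, and all data are simulated.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n_subjects, n_voxels, n_snps = 200, 1000, 18

fa = rng.normal(0.5, 0.05, (n_subjects, n_voxels))   # voxelwise FA
cholesterol = rng.normal(5.0, 1.0, n_subjects)       # total serum level
snps = rng.integers(0, 3, (n_subjects, n_snps)).astype(float)

# Stage 1: keep only voxels whose FA correlates with cholesterol.
r = np.array([np.corrcoef(fa[:, v], cholesterol)[0, 1]
              for v in range(n_voxels)])
mask = np.abs(r) > 0.15  # illustrative cutoff; the paper's
                         # significance criterion is not reproduced here

# Stage 2: jointly regress mean FA over the reduced space on all SNPs.
y = fa[:, mask].mean(axis=1)
fit = LinearRegression().fit(snps, y)
print(fit.coef_)  # joint multi-SNP effects within the reduced space
```

Restricting stage 2 to the screened voxels is what shrinks the multiple-comparisons burden relative to a whole-brain, all-variant search.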