10 results for Game of rules
in Aston University Research Archive
Abstract:
The penalty kick in football is a seemingly simple play; however, it has increased in complexity since 1997, when the rules changed to allow goalkeepers to move laterally along their goal line before the ball is kicked. Prior to 1997, goalkeepers were required to remain still until the ball was struck. The objective of this study was to determine the importance of the penalty kick in the modern game of football. A retrospective study of the 2002, 2006 and 2010 World Cup and the 2000, 2004 and 2008 European Championship tournaments was carried out, assessing the importance of the penalty kick in match play and shootouts and the effect of the time of the game on the shooter's success rate. This study demonstrated a penalty conversion rate of 73% in shootouts and 68% in match play. Significantly more penalties were awarded late in the game: twice as many in the second half as in the first, and close to four times as many in the fourth quarter as in the first. Teams awarded penalty kicks during match play won 52%, drew 30% and lost 18% of the time; the chance of winning increased to 61% if the penalty was scored, but decreased to 29% if it was missed. Teams reaching the World Cup or European Championship final had roughly a 50% chance of being involved in a penalty shootout during the tournament. Penalty kicks and their outcomes significantly affect match results in post-1997 football.
Abstract:
Anyone who looks at the title of this special issue will agree that the intent behind preparing this volume was ambitious: to predict and discuss “The Future of Manufacturing”. Will manufacturing be important in the future? Even though some sceptics might say not, putting some old familiar arguments on the table, we would strongly disagree. To support the argument we issued the call for papers for this special issue of the Journal of Manufacturing Technology Management, fully aware of the size of the challenge on our hands, but strongly believing that the enterprise would be worthwhile. The point of departure is the ongoing debate concerning the meaning and content of manufacturing. The easily visualised internal activity of using tangible resources to make physical products in factories is no longer a viable way to characterise manufacturing. It is now a more loosely defined concept concerning the organisation and management of open, interdependent systems for delivering goods and services, tangible and intangible, to diverse types of markets. Interestingly, Wickham Skinner is the most cited author in this special issue of JMTM. He provides the point of departure for several articles because his vision and insights have guided and inspired researchers in production and operations management from the late 1960s until today. However, the picture we draw after looking at the contributions in this special issue is intrinsically distinct: much more dynamic, and more complex.
Seven articles address the following research themes: (1) new patterns of organisation, where the boundaries of firms become blurred and the role of the firm in the production system, as well as that of manufacturing within the firm, becomes contingent; (2) new approaches to strategic decision-making in markets characterised by turbulence and weak signals at the customer interface; (3) new challenges in strategic and operational decisions due to changes in the profile of the workforce; (4) new global players, especially China, modifying the manufacturing landscape; and (5) new techniques, methods and tools made feasible by progress in new technological domains. Of course, many other important dimensions could be studied, but these themes are representative of current changes and future challenges. Three articles address the first theme: the organisational evolution of production and operations in firms and networks. Karlsson and Sköld's article represents a further step in their efforts to characterise “the extraprise”. In it, they advance the construction of a new framework based on “the network perspective”, defining the formal elements which compose it and exploring the meaning of different types of relationships. The way in which “actors, resources and activities” are conceptualised extends the existing boundaries of analytical thinking in operations management and opens new avenues for research, teaching and practice. The higher level of abstraction, an intrinsic feature of the framework, is associated with the increasing complexity that characterises decisions related to strategy and implementation in the manufacturing and operations area, a feature expected to become ever more pervasive. Riis, Johansen, Englyst and Sorensen have also based their article on their previous work, in this case on “the interactive firm”.
They advance new propositions on the strategic roles of manufacturing and discuss why the configuration of strategic manufacturing roles, at the level of the network, will become a key issue, and how the indirect strategic roles of manufacturing will become increasingly important. Additionally, considering that value chains will become value webs, they predict that shifts in strategic manufacturing roles will look like a sequence of moves similar to a game of chess. Lastly under the first theme, Fleury and Fleury develop a conceptual framework for the study of production systems in general, derived from field research in the telecommunications industry, here considered a prototype of the coming information society and knowledge economy. They propose a new typology of firms which, on certain dimensions, complements the propositions found in the other two articles. Their telecoms-based framework (TbF) comprises six types of companies characterised by distinct profiles of organisational competences, which interact according to specific patterns of relationships, thus creating distinct configurations of production networks. The second theme is addressed by Kyläheiko and Sandström in their article “Strategic options based framework for management of dynamic capabilities in manufacturing firms”. They propose a new approach to strategic decision-making in markets characterised by turbulence and weak signals at the customer interface. Their framework for a manufacturing firm in the digital age leads to active asset selection (strategic investments in both tangible and intangible assets) and efficient orchestration of the global value net in “thin” intangible asset markets. The framework consists of five steps based on Porter's five-forces model and the resource-based view, complemented by the concepts of strategic options and related flexibility issues.
Thun, Grössler and Miczka's contribution to the third theme brings the human dimension into the debate on the future of manufacturing. Their article focuses on the challenges that the ageing of workers in Germany poses for management, but the arguments raised make visible and relevant the future challenges associated with workers and work organisation in every production system. An interesting point in the authors' approach is that not only are the factual problems and solutions taken into account, but the managers' perceptions are also brought into the picture. China cannot be absent from a discussion of the future of manufacturing. Within the fourth theme, therefore, Vaidya, Bennett and Liu provide evidence of the gradual improvement of Chinese companies in the medium- and high-tech sectors, using revealed comparative advantage (RCA) analysis. The Chinese evolution is shown to be based on capabilities developed through combining international technology transfer and indigenous learning. The main implication for Western companies is the need to take account of the accelerated rhythm of capability development in China; for other developing countries, China's case provides lessons of great importance. Finally, under the fifth theme, Kuehnle's article “Post mass production paradigm (PMPP) trajectories” provides a futuristic scenario of what is already around us and might become prevalent in the future. It takes an intensive look at a whole set of dimensions that are affecting manufacturing now and will influence it in the future, ranging from the application of ICT to the need for social transparency. In summary, this special issue of JMTM presents a brief but indisputable demonstration of the possible richness of manufacturing in the future. Indeed, we could even say that manufacturing has no future if we stick only to past perspectives. Embracing the new is not easy.
The new configurations of production systems, the distributed and complementary roles to be performed by distinct types of companies in diversified networked structures, leveraged by emergent technologies and associated with new challenges for managing people, are all themes that carry the future. The Guest Editors of this special issue on the future of manufacturing are strongly convinced that their undertaking has been worthwhile.
Abstract:
The thesis examines Kuhn's (1962, 1970) concept of paradigm, assesses how it is employed for mapping intellectual terrain in the social sciences, and evaluates its use in research based on multiple theory positions. In so doing it rejects both the thesis of total paradigm 'incommensurability' (Kuhn, 1962) and that of liberal 'translation' (Popper, 1970), in favour of a middle ground through the 'language-game of everyday life' (Wittgenstein, 1953). The thesis ultimately argues for the possibility of being 'trained into' new paradigms, given the premise that 'unorganised experience cannot order perception' (Phillips, 1977). In conducting multiple paradigm research, the analysis uses the Burrell and Morgan (1979) model to examine the work organisation of a large provincial fire service. This analysis comprises, firstly, a 'functionalist' assessment of work design, demonstrating inter alia the decrease in reported motivation with length of service; secondly, an 'interpretive' portrayal of the daily accomplishment of task routines, highlighting the discretionary and negotiated nature of the day's events; thirdly, a 'radical humanist' analysis of workplace ideology, demonstrating the hegemonic role of officer training practices; and finally, a 'radical structuralist' description of the labour process, focusing on the establishment of a 'normal working day'. Although the argument is made for the possibility of conducting multiple paradigm research, the conclusion stresses the many institutional pressures serving to offset such development.
Abstract:
This paper introduces a mechanism for generating a series of rules that characterize the money price relationship for the USA, defined as the relationship between the rate of growth of the money supply and inflation. Monetary component data is used to train a selection of candidate feedforward neural networks. The selected network is mined for rules, expressed in human-readable and machine-executable form. The accuracy of the extracted rules and of the network is compared, and expert commentary is made on the readability and reliability of the extracted rule set. The ultimate goal of this research is to produce rules that meaningfully and accurately describe inflation in terms of the monetary component dataset.
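As a rough illustration of the rule-mining idea this abstract describes, the sketch below fits a tiny feedforward network on synthetic money-growth/inflation data and then probes it to extract one human-readable, machine-executable threshold rule. The data, the network shape, and the least-squares fitting of the output layer are all assumptions for illustration, not the paper's actual networks or dataset.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in data: inflation rises roughly one-for-one with
# money-supply growth above a 2% threshold (invented, not the paper's data).
x = rng.uniform(0, 10, 400)                              # money growth, %
y = np.maximum(x - 2.0, 0.0) + rng.normal(0, 0.1, 400)   # inflation, %

# A small feedforward network: fixed random tanh hidden units, with the
# linear output layer fitted in closed form by least squares.
n_hidden = 16
slopes = rng.normal(0, 5, n_hidden)
offsets = rng.uniform(-5, 5, n_hidden)

def hidden(v):
    z = np.atleast_1d(v) / 10.0                     # scale input to [0, 1]
    H = np.tanh(z[:, None] * slopes + offsets)
    return np.hstack([H, np.ones((len(z), 1))])     # bias column

W2, *_ = np.linalg.lstsq(hidden(x), y, rcond=None)  # output weights

def predict(v):
    return hidden(v) @ W2

# "Mine" a rule: probe the trained network over an input grid and locate
# where predicted inflation first exceeds 1%.
grid = np.linspace(0, 10, 1001)
threshold = grid[np.argmax(predict(grid) > 1.0)]
print(f"IF money growth > {threshold:.1f}% THEN predicted inflation > 1%")
```

The printed rule is both readable and directly executable as a comparison, which is the spirit of the rule set the abstract describes.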
Abstract:
Commentary on the target article “From simple associations to systematic reasoning: a connectionist representation of rules, variables, and dynamic bindings using temporal synchrony”, by L. Shastri and V. Ajjanagadde, pp. 417-494.
Abstract:
Derivational morphology proposes meaningful connections between words and is largely unrepresented in lexical databases. This thesis presents a project to enrich a lexical database with morphological links and to evaluate their contribution to disambiguation. A lexical database with sense distinctions was required. WordNet was chosen because of its free availability and widespread use. Its suitability was assessed through critical evaluation with respect to specifications and criticisms, using a transparent, extensible model. The identification of serious shortcomings suggested a portable enrichment methodology, applicable to alternative resources. Although 40% of the most frequent words are prepositions, they have been largely ignored by computational linguists, so the addition of prepositions was also required. The preferred approach to morphological enrichment was to infer relations from phenomena discovered algorithmically. Both existing databases and existing algorithms can capture regular morphological relations, but cannot capture exceptions correctly; neither provides any semantic information. Some morphological analysis algorithms are subject to the fallacy that morphological analysis can be performed simply by segmentation. Morphological rules, grounded in observation and etymology, govern associations between and attachment of suffixes and contribute to defining the meaning of morphological relationships. Specifying character substitutions circumvents the segmentation fallacy. Morphological rules are prone to undergeneration, minimised through a variable lexical validity requirement, and overgeneration, minimised by rule reformulation and restricting monosyllabic output. Rules take into account the morphology of ancestor languages through co-occurrences of morphological patterns. Multiple rules applicable to an input suffix need their precedence established.
The resistance of prefixations to segmentation has been addressed by identifying linking vowel exceptions and irregular prefixes. The automatic affix discovery algorithm applies heuristics to identify meaningful affixes and is combined with morphological rules into a hybrid model, fed only with empirical data, collected without supervision. Further algorithms apply the rules optimally to automatically pre-identified suffixes and break words into their component morphemes. To handle exceptions, stoplists were created in response to initial errors and fed back into the model through iterative development, leading to 100% precision, contestable only on lexicographic criteria. Stoplist length is minimised by special treatment of monosyllables and reformulation of rules. 96% of words and phrases are analysed. 218,802 directed derivational links have been encoded in the lexicon rather than the WordNet component of the model because the lexicon provides the optimal clustering of word senses. Both links and analyser are portable to an alternative lexicon. The evaluation uses the extended gloss overlaps disambiguation algorithm. The enriched model outperformed WordNet in terms of recall without loss of precision. The failure of all experiments to outperform disambiguation by frequency reflects on WordNet sense distinctions.
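The character-substitution idea, and the lexical validity check that limits overgeneration, can be illustrated with a toy sketch. The mini-lexicon and the three rules below are invented for illustration and are far simpler than the thesis's WordNet-based model.

```python
# Character-substitution rules avoid the segmentation fallacy:
# "happiness" is not "happi" + "ness" but "happy" with a y -> i
# substitution before the suffix. (Toy rules and lexicon, for
# illustration only.)

LEXICON = {"happy", "create", "decide", "nation", "happiness",
           "creation", "decision"}

# Each rule: (suffix to strip, replacement string). The candidate base
# must pass the lexical validity requirement (i.e. exist in the
# lexicon) to minimise overgeneration.
RULES = [
    ("iness", "y"),    # happiness -> happy
    ("ation", "ate"),  # creation  -> create
    ("ision", "ide"),  # decision  -> decide
]

def derive_base(word, lexicon=LEXICON):
    """Return (base, rule) for the first applicable valid rule, else None."""
    for suffix, repl in RULES:
        if word.endswith(suffix):
            candidate = word[: -len(suffix)] + repl
            if candidate in lexicon:       # lexical validity requirement
                return candidate, (suffix, repl)
    return None

print(derive_base("happiness"))   # ('happy', ('iness', 'y'))
print(derive_base("creation"))    # ('create', ('ation', 'ate'))
print(derive_base("nation"))      # None: candidate 'nate' fails validity
```

The "nation" case shows how the validity requirement blocks an overgenerated analysis that plain suffix stripping would accept.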
Abstract:
This paper describes the design and evaluation of AstonTAC, the runner-up in the Ad Auction Game of the 2009 International Trading Agent Competition. In particular, we focus on how AstonTAC generates adaptive bid prices according to the Market-based Value Per Click and how it selects a set of keyword queries to bid on to maximise expected profit under limited conversion capacity. Through evaluation experiments, we show that AstonTAC performs well and stably not only in the competition but also across a broad range of environments. © 2010 The authors and IOS Press. All rights reserved.
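A minimal sketch of the capacity-constrained query-selection problem the abstract mentions: the query data and the greedy profit-per-conversion heuristic below are assumptions for illustration, not the actual AstonTAC strategy described in the paper.

```python
# Each query: expected conversions per day and expected profit per
# conversion if the agent wins impressions on it (invented numbers).
queries = [
    {"q": "flat_tv", "conv": 30, "profit_per_conv": 9.0},
    {"q": "audio",   "conv": 50, "profit_per_conv": 6.5},
    {"q": "dvd",     "conv": 40, "profit_per_conv": 7.2},
    {"q": "generic", "conv": 80, "profit_per_conv": 2.1},
]

CAPACITY = 100  # max conversions the agent can fulfil per day

def select_queries(queries, capacity):
    """Greedily pick queries with the highest profit per conversion
    until the expected conversion capacity is exhausted."""
    chosen, used, profit = [], 0, 0.0
    for item in sorted(queries, key=lambda d: d["profit_per_conv"],
                       reverse=True):
        if used + item["conv"] <= capacity:
            chosen.append(item["q"])
            used += item["conv"]
            profit += item["conv"] * item["profit_per_conv"]
    return chosen, used, profit

chosen, used, profit = select_queries(queries, CAPACITY)
print(chosen, used, profit)   # ['flat_tv', 'dvd'] 70 558.0
```

The greedy rule illustrates why, with limited capacity, a bidder prefers fewer high-margin conversions over a larger volume of low-margin ones.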
Abstract:
A review of available literature suggests that social identification exists at the interface between individual and collective identity work. This poster proposes that it is the interaction between these two processes that leads a person to define themselves in terms of their membership of a particular social group. The poster suggests that identity work undertaken by the group (or ‘the creation of identities as widely understood signs with a set of rules and conventions for their use’, Schwalbe & Mason-Schrock, 1996, p.115) can be used by a person to inform their own individual identity work and, from this, the extent of alignment between their identity and the perceived identity of the group. In stable or internally structured groups, collective identity work may simply take the form of communication and preservation of dominant collective identities. However, in unstable, new or transitional groups, interaction between individual and collective identity work may be more dynamic, as both collective and individual identities are simultaneously codified, enacted and refined. To develop an understanding of social identification that is applicable in both stable and transitional social groups, it is useful to consider recent proposals that identification may occur cyclically as a series of discrete episodes (Ashforth, Harrison & Corley, 2008). This poster draws on the literature to present these suggestions in greater detail, outlining propositions for social identification that are relevant to transient as well as stable identity formation, supported by suggestions of how episodes of social identification may lead to a person identifying with a group.
Abstract:
Long-term foetal surveillance is often recommended. Fully non-invasive acoustic recording through the maternal abdomen therefore represents a valuable alternative to ultrasonic cardiotocography. Unfortunately, the recorded heart sound signal is heavily corrupted by noise, so determining the foetal heart rate raises serious signal processing issues. In this paper we present a new algorithm for foetal heart rate estimation from foetal phonocardiographic recordings. Filtering is employed as the first step of the algorithm to reduce the background noise. A block for enhancing the first heart sounds is then used to further attenuate other components of the foetal heart sound signal. A complex logic block, guided by a number of rules concerning foetal heart beat regularity, follows, detecting the most probable first heart sounds among several candidates. A final block performs exact first heart sound timing and, in turn, foetal heart rate estimation. The filtering and enhancing blocks are implemented by means of different techniques, so that different processing paths are proposed. Furthermore, a reliability index is introduced to quantify the consistency of the estimated foetal heart rate and, based on statistical parameters, a software quality index is designed to indicate the most reliable analysis procedure (that is, the combination of processing path and first heart sound time mark that provides the lowest estimation errors). The algorithm's performance has been tested on phonocardiographic signals recorded in a local private gynaecology practice from a sample group of about 50 pregnant women. Phonocardiographic signals were recorded simultaneously with ultrasonic cardiotocographic signals in order to compare the two foetal heart rate series (the one estimated by our algorithm and the one provided by the cardiotocographic device).
Our results show that the proposed algorithm, in particular some of its analysis procedures, provides reliable foetal heart rate signals very close to the reference cardiotocographic recordings. © 2010 Elsevier Ltd. All rights reserved.
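The block structure the abstract describes (filtering, first-heart-sound enhancement, regularity-guided detection, rate estimation) can be sketched on synthetic data. Everything below is a deliberately simplified assumption for illustration: the sampling rate, the synthetic click train standing in for first heart sounds, the energy-envelope enhancement, and the threshold-plus-refractory regularity rule are not the paper's actual processing paths.

```python
import numpy as np

FS = 1000  # sampling rate, Hz (assumed)

def synth_pcg(bpm=140, seconds=10, fs=FS, noise=0.5, seed=0):
    """Synthetic noisy phonocardiogram: a short click at every beat."""
    rng = np.random.default_rng(seed)
    n = seconds * fs
    x = rng.normal(0, noise, n)
    period = int(round(60 / bpm * fs))
    for i in range(0, n, period):
        x[i:i + 20] += 3.0          # crude 20 ms "first heart sound"
    return x

def envelope(x, win=30):
    """Enhancement block: smoothed energy envelope."""
    return np.convolve(x**2, np.ones(win) / win, mode="same")

def detect_beats(env, fs=FS, min_rr=0.25):
    """Rule-guided detection: keep threshold crossings, enforcing a
    minimum beat-to-beat interval (a simple regularity rule)."""
    thr = env.mean() + 2 * env.std()
    refractory = int(min_rr * fs)
    beats, last = [], -refractory
    for i in np.flatnonzero(env > thr):
        if i - last >= refractory:
            beats.append(i)
            last = i
    return np.array(beats)

x = synth_pcg()
beats = detect_beats(envelope(x))
rr = np.diff(beats) / FS            # beat-to-beat intervals, s
fhr = 60.0 / np.median(rr)          # estimated foetal heart rate, bpm
print(f"estimated FHR = {fhr:.0f} bpm")
```

The median over beat-to-beat intervals plays the role of a consistency safeguard: isolated detection errors shift individual intervals without dragging the rate estimate.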
Abstract:
To benefit from the advantages that Cloud Computing brings to the IT industry, management policies must be implemented as part of the operation of the Cloud. For example, policies can be specified for managing energy to reduce the cost of running the IT system, or for security while handling users' privacy issues. As cloud platforms are large, manual enforcement of policies is not scalable; hence, autonomic approaches to management policies have recently received considerable attention. These approaches allow the specification of rules that are executed via rule engines. The process of rule creation starts with the interpretation of policies drafted by high-level managers; technical IT staff then translate such policies into operational activities to implement them. Such a process can start from a textual declarative description and, after numerous steps, terminate in a set of rules to be executed on a rule engine. To simplify these steps and to bridge the considerable gap between declarative policies and executable rules, we propose a domain-specific language called CloudMPL. We also design a method for the automated transformation of the rules captured in CloudMPL into the popular Drools rule engine. As policies change over time, code generation will reduce the time required to implement them. In addition, using a declarative language for writing the specifications is expected to make the authoring of rules easier. We demonstrate the use of the CloudMPL language on a running example extracted from an energy consumption management case study.