Daniel Howard


dr.daniel.howard@gmail.com

Journal articles

2009
Daniel Howard, Adrian Brezulianu, Joseph Kolibal (2009)  Genetic programming of the stochastic interpolation framework: convection–diffusion equation   Soft Computing - A Fusion of Foundations, Methodologies and Applications Oct  
Abstract: The stochastic interpolation (SI) framework of function recovery from input data comprises a de-convolution step followed by a convolution step with row-stochastic matrices generated by a mollifier, such as a probability density function. The choice of mollifier, and of how it is weighted, offers unprecedented flexibility to vary both the interpolation character and the extent of influence of neighbouring data values. In this respect, a soft computing method such as a genetic algorithm or a heuristic method may assist applications that model complex or unknown relationships between data by tuning the parameter, functional, and component choices inherent in SI. Alternatively or additionally, the input data itself can be reverse-engineered to recover a function that satisfies given properties, as illustrated in this paper with a genetic programming scheme that enables SI to recover the analytical solution to a two-point boundary value convection–diffusion differential equation. If further developed, this nascent solution method could serve as an alternative to the weighted residual methods, which are known to have inherent mathematical difficulties.
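A minimal sketch of the two-step SI recovery described in the abstract, assuming a Gaussian probability density as the mollifier with a constant width sigma; the names row_stochastic and si_interpolate, and the test data, are illustrative, not from the paper:

    import numpy as np

    def row_stochastic(centers, nodes, sigma):
        """Gaussian mollifier matrix, normalized so each row sums to 1."""
        d = nodes[:, None] - centers[None, :]
        w = np.exp(-0.5 * (d / sigma) ** 2)
        return w / w.sum(axis=1, keepdims=True)

    def si_interpolate(x, y, x_new, sigma=0.1):
        """De-convolve on the data nodes, then convolve onto new nodes."""
        B = row_stochastic(x, x, sigma)       # square, row stochastic
        c = np.linalg.solve(B, y)             # de-convolution step
        K = row_stochastic(x, x_new, sigma)   # convolution step
        return K @ c

    x = np.linspace(0.0, 1.0, 20)
    f = si_interpolate(x, np.sin(2 * np.pi * x), np.linspace(0.0, 1.0, 200))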
2008
Joseph Kolibal, Daniel Howard (2008)  Alternative parametric boundary reconstruction method for biomedical imaging.   J Biomed Biotechnol 2008:  
Abstract: Determining the outline or boundary contour of a two-dimensional object, or the surface of a three-dimensional object poses difficulties particularly when there is substantial measurement noise or uncertainty. By adapting the mathematical approach of stochastic function recovery to this task, it is possible to obtain usable estimates for these boundaries, even in the presence of large amounts of noise. The technique is applied to parametric boundary data and has potential applications in biomedical imaging. It should be considered as one of several techniques to improve the visualization of images.
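As an illustration of the idea, under the simplifying assumption that convolution with a row-stochastic Gaussian kernel stands in for the paper's stochastic function recovery, a noisy closed boundary given parametrically as (x(t), y(t)) can be estimated coordinate by coordinate; all names and parameters below are illustrative:

    import numpy as np

    def periodic_smoother(t, sigma):
        """Row-stochastic Gaussian kernel on a closed (periodic) parameter."""
        d = np.abs(t[:, None] - t[None, :])
        d = np.minimum(d, 1.0 - d)  # wrap-around distance on [0, 1)
        w = np.exp(-0.5 * (d / sigma) ** 2)
        return w / w.sum(axis=1, keepdims=True)

    t = np.linspace(0.0, 1.0, 200, endpoint=False)
    rng = np.random.default_rng(0)
    x = np.cos(2 * np.pi * t) + 0.05 * rng.standard_normal(t.size)  # noisy circle
    y = np.sin(2 * np.pi * t) + 0.05 * rng.standard_normal(t.size)

    S = periodic_smoother(t, sigma=0.02)
    x_hat, y_hat = S @ x, S @ y  # recovered boundary estimate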
Daniel Howard, Simon C Roberts, Conor Ryan, Adrian Brezulianu (2008)  Textural classification of mammographic parenchymal patterns with the SONNET Selforganizing neural network.   J Biomed Biotechnol 2008:  
Abstract: In nationwide mammography screening, thousands of mammography examinations must be processed. Each consists of two standard views of each breast, and each mammogram must be visually examined by an experienced radiologist to assess it for any anomalies. The ability to detect an anomaly in mammographic texture is important to successful outcomes in mammography screening. In this study, a large number of mammograms were digitized with a highly accurate scanner, and textural features were derived from the mammograms as input data to a SONNET self-organizing neural network. The paper discusses how SONNET was used to produce a taxonomic organization of the mammography archive in an unsupervised manner. This process is subject to certain choices of SONNET parameters; in these numerical experiments, which used the craniocaudal view, it typically produced O(10) mammogram classes, for example 39, by analysis of features from O(10^3) mammogram images. The mammogram taxonomy captured typical subtleties that discriminate mammograms, and it is submitted that this may be exploited to aid the detection of mammographic anomalies, for example by acting as a preprocessing stage to simplify the task for a computational detection scheme, or by ordering mammography examinations by mammogram taxonomic class prior to screening in order to encourage more successful visual examination during screening. The resulting taxonomy may help train screening radiologists and conceivably help to settle legal cases concerning a mammography screening examination, because the taxonomy can reveal the frequency of mammographic patterns in a population.
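SONNET itself is not specified in this list; as a generic stand-in for the unsupervised taxonomy step, the sketch below clusters texture feature vectors with a small self-organizing map whose 5 x 8 grid allows up to 40 classes, comparable to the 39 reported. All names, parameters, and the random features are assumptions for illustration:

    import numpy as np

    def train_som(feats, grid=(5, 8), iters=2000, lr0=0.5, rad0=3.0, seed=0):
        """Fit a small self-organizing map to feature vectors."""
        rng = np.random.default_rng(seed)
        rows, cols = grid
        coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
        w = rng.standard_normal((rows * cols, feats.shape[1]))
        for it in range(iters):
            v = feats[rng.integers(len(feats))]
            bmu = np.argmin(((w - v) ** 2).sum(axis=1))    # best-matching unit
            frac = it / iters
            lr, rad = lr0 * (1.0 - frac), max(rad0 * (1.0 - frac), 0.5)
            g = np.exp(-((coords - coords[bmu]) ** 2).sum(axis=1) / (2 * rad ** 2))
            w += lr * g[:, None] * (v - w)                 # pull units toward v
        return w

    def taxonomic_class(feats, w):
        """Class of each mammogram = index of its best-matching unit."""
        return np.array([np.argmin(((w - v) ** 2).sum(axis=1)) for v in feats])

    feats = np.random.rand(300, 16)  # e.g. 300 mammograms, 16 texture features
    classes = taxonomic_class(feats, train_som(feats))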
2006
Joseph Kolibal, Daniel Howard (2006)  MALDI-TOF Baseline Drift Removal Using Stochastic Bernstein Approximation   EURASIP Journal on Applied Signal Processing 2006: 63582. 9 Jan  
Abstract: Stochastic Bernstein (SB) approximation can tackle the problem of baseline drift correction of instrumentation data. This is demonstrated for spectral data: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF) data. Two SB schemes for removing the baseline drift are presented: iterative and direct. Following an explanation of the origin of the MALDI-TOF baseline drift, which sheds light on the inherent difficulty of its removal by chemical means, SB baseline drift removal is illustrated for both proteomics and genomics MALDI-TOF data sets. SB is an elegant signal processing approach that yields a numerically straightforward baseline drift removal method, as it includes a free parameter σ(x) that can be optimized for different baseline drift removal applications. Therefore, research that determines putative biomarkers from the spectral data might benefit from a sensitivity analysis to the underlying spectral measurement, made possible by varying the SB free parameter. This parameter can be tuned manually (for constant σ) or with evolutionary computation (for σ(x)).
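A sketch in the spirit of the iterative scheme, assuming a constant σ and a uniformly sampled spectrum: the signal is repeatedly mollified and the pointwise minimum is kept, so the estimate settles under the peaks and tracks the drift. The names, stopping rule, and demo signal are assumptions, not the paper's scheme verbatim:

    import numpy as np

    def mollify(y, sigma):
        """Row-stochastic Gaussian smoothing of a uniformly sampled signal."""
        i = np.arange(len(y))
        w = np.exp(-0.5 * ((i[:, None] - i[None, :]) / sigma) ** 2)
        return (w / w.sum(axis=1, keepdims=True)) @ y

    def remove_baseline(y, sigma=50.0, iters=20):
        """Iterative drift estimate: mollify, keep the pointwise minimum."""
        b = y.copy()
        for _ in range(iters):
            b = np.minimum(b, mollify(b, sigma))
        return y - b, b  # corrected spectrum and estimated drift

    spectrum = np.random.rand(400) + np.linspace(0.0, 2.0, 400)  # drifting signal
    corrected, drift = remove_baseline(spectrum)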
Daniel Howard, Simon C Roberts, Conor Ryan (2006)  Pragmatic Genetic Programming strategy for the problem of vehicle detection in airborne reconnaissance   Pattern Recognition Letters 27: 11. 1275-1288 Aug  
Abstract: A Genetic Programming (GP) method uses multiple runs and data decomposition stages to evolve a hierarchical set of vehicle detectors for the automated inspection of infrared line scan imagery obtained by a low-flying aircraft. The performance of the scheme using two different sets of GP terminals (all rotationally invariant statistics of pixel data) is compared on 10 images. The discrete Fourier transform set is found to be marginally superior to the simpler statistics set that includes an edge detector. An analysis of detector formulae provides insight into vehicle detection principles. In addition, a promising family of algorithms that takes advantage of the GP method's ability to prescribe an advantageous solution architecture is developed as a post-processor. These algorithms selectively reduce false alarms by exploring context, and they determine the amount of contextual information required for this task.
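A sketch of the kind of rotationally invariant terminals the abstract mentions: means and standard deviations of concentric rings around a candidate pixel, fed to a hypothetical evolved formula. The terminals, radii, and formula are illustrative, not those of the paper:

    import numpy as np

    def ring_stats(img, cy, cx, radii=(2, 4, 6)):
        """Rotationally invariant terminals: mean/std of concentric rings."""
        yy, xx = np.ogrid[:img.shape[0], :img.shape[1]]
        r = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)
        stats, inner = [], 0
        for outer in radii:
            ring = img[(r >= inner) & (r < outer)]
            stats.extend([ring.mean(), ring.std()])
            inner = outer
        return np.array(stats)

    def evolved_detector(t):
        """A hypothetical evolved formula over the terminals; fires if > 0."""
        return t[0] - t[2] + 0.5 * t[3] - t[5]

    img = np.random.rand(64, 64)
    fires = evolved_detector(ring_stats(img, 32, 32)) > 0.0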
2003
Daniel Howard, Karl Benson (2003)  Evolutionary computation method for pattern recognition of cis-acting sites.   Biosystems 72: 1-2. 19-27 Nov  
Abstract: This paper develops an evolutionary method that learns inductively to recognize the makeup and the position of very short consensus sequences, cis-acting sites, which are a typical feature of promoters in genomes. The method combines Finite State Automata (FSA) and Genetic Programming (GP) to discover candidate promoter sequences in primary sequence data. An experiment measures the success of the method for promoter prediction in the human genome. This class of method can take large base-pair jumps, which may enable it to process very long genomic sequences to discover gene-specific cis-acting sites and genes that are regulated together.
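To illustrate the flavour of what such an FSA can recognize, the regular expression below (itself a finite state automaton) matches one motif, takes a large base-pair jump, and then matches a second motif; the motifs and jump range are invented for illustration and are not the evolved patterns of the paper:

    import re

    # Hypothetical two-site pattern: a CCAAT-box-like motif, a 40-120 bp
    # jump, then a TATA-box-like motif (lookahead allows overlapping hits).
    PATTERN = re.compile(r"(?=(CCAAT[ACGT]{40,120}TATA[AT]A))")

    def candidate_sites(seq):
        """Start positions of windows matching the two-site pattern."""
        return [m.start() for m in PATTERN.finditer(seq.upper())]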
1999
Daniel Howard, Simon C Roberts, Richard Brankin (1999)  Target detection in SAR imagery by genetic programming   Advances in Engineering Software 30: 5. 303-311  
Abstract: The automatic detection of ships in low-resolution synthetic aperture radar (SAR) imagery is investigated in this article. The detector design objectives are to maximise detection accuracy across multiple images, to minimise the computational effort during image processing, and to minimise the effort during the design stage. The results of an extensive numerical study show that a novel approach, using genetic programming (GP), successfully evolves detectors which satisfy these objectives. Each detector represents an algebraic formula, and thus the principles of detection can be discovered and reused. This is a major advantage over artificial intelligence techniques that use more complicated representations, e.g. neural networks.
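Because each evolved detector is an algebraic formula over simple image statistics, it can be applied cheaply with a sliding window, as in this sketch; the window statistics and the example formula are assumptions for illustration, not the paper's evolved detectors:

    import numpy as np

    def window_stats(img, size=8):
        """Mean and standard deviation of each tile on a coarse grid."""
        h, w = img.shape
        return {(y, x): (img[y:y+size, x:x+size].mean(),
                         img[y:y+size, x:x+size].std())
                for y in range(0, h - size + 1, size)
                for x in range(0, w - size + 1, size)}

    def detect(img, formula, size=8):
        """Flag tiles where the evolved algebraic formula is positive."""
        return [pos for pos, (m, s) in window_stats(img, size).items()
                if formula(m, s) > 0.0]

    # A hypothetical formula: bright, locally uniform patches score high.
    hits = detect(np.random.rand(128, 128), lambda m, s: m - 3.0 * s - 0.2)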