Books

Published:


Cover of UML-Based Software Product Line Engineering with SMarty

UML-Based Software Product Line Engineering with SMarty

Edson OliveiraJr (ed.)
Springer, 2023

DOI: 10.1007/978-3-031-18556-4


This book is about software product lines (SPLs) designed and developed with UML diagrams as the primary basis, modeled according to a rigorous approach composed of a UML profile and a systematic process for variability management activities, forming the Stereotype-based Management of Variability (SMarty) approach. The book consists of five parts. Part I provides essential concepts on SPLs in terms of the first development methodologies. It also introduces variability concepts and discusses SPL architectures, finishing with the SMarty approach. Part II focuses on the design, verification, and validation of SMarty SPLs, and Part III concentrates on SPL architecture evolution based on ISO/IEC metrics, the SystEM-PLA method, optimization with the MOA4PLA method, and feature interaction prevention. Next, Part IV presents SMarty as a basis for SPL development, including the M-SPLearning SPL for mobile learning applications, the PLeTs SPL for testing tools, the PlugSPL plugin environment for supporting the SPL life cycle, the SyMPLES approach for designing embedded systems with SysML, the SMartySPEM approach for software process lines (SPrLs), and the re-engineering of class diagrams into an SPL. Finally, Part V promotes controlled experimentation in UML-based SPLs, presenting essential concepts on how to plan, conduct, and document experiments, as well as several experiments carried out with SMarty.

This book is aimed at lecturers, graduate students, and experienced practitioners. Lecturers might use the book for graduate-level courses on SPL fundamentals and tools; students will learn about the SPL engineering process, variability management, and mass customization; and practitioners will see how to plan the transition from single-product development to an SPL-based process, how to document inherent variability in a given domain, and how to apply controlled experiments to SPLs.


Cover of Controlled Experimentation of Digital Forensics - Towards Formalization for Strengthening Evidence Reproducibility, Reliability, and Transparency

Controlled Experimentation of Digital Forensics - Towards Formalization for Strengthening Evidence Reproducibility, Reliability, and Transparency

Edson OliveiraJr, Thiago J. Silva, Charles V. Neu, Avelino F. Zorzo, Ana H. Mazur
Springer, 2026

DOI:

The book provides a comprehensive reflection on the scientific foundations of evidence-based Digital Forensics as an emerging discipline. It goes beyond the practice-oriented roots of Digital Forensics to present a mature field grounded in reproducible, systematic, and transparent research methods. Its title, Controlled Experimentation of Digital Forensics: Towards Formalization for Strengthening Evidence Reproducibility, Reliability, and Transparency, encapsulates this ambition, emphasizing both the need to formalize experimental processes and the commitment to fostering openness and rigor in forensic science. Through a conceptual model (ExperDF-CM) and a semantic framework (ExperDF-Onto), the book proposes practical mechanisms for structuring, documenting, and validating experiments in Digital Forensics. It ultimately situates the discipline within the broader movement of Open Science, aligning its principles with the UNESCO Recommendation on Open Science and the pursuit of socially responsible research.


Book Chapters

  • The Future of Experimentation in Digital Forensics
    Edson OliveiraJr
    In Frontiers of Forensic Science: Innovation, Technology, and Justice. Nova Science Publishers, 2026
    Abstract: Digital Forensics (DF) focuses on identifying, acquiring, analyzing, and reporting information from digital devices, especially in computer crime investigations. The investigative process encompasses the phases of identification, preservation, collection, examination, analysis, and presentation, all of which are crucial for maintaining evidence integrity and ensuring legal admissibility. The rise of complex digital environments, such as cloud computing and the Internet of Things (IoT), along with technologies like Artificial Intelligence (AI) and Natural Language Processing (NLP), is reshaping the field. This chapter emphasizes the importance of experimentation in developing and validating DF methodologies, as reliable forensic findings depend on rigorous experimental processes. Challenges include a lack of standardization, insufficient documentation of experimental elements, and limited realistic datasets for validating forensic tools. Additionally, current forensic tools often lack benchmarking frameworks. To enhance controlled experimentation in DF, the chapter advocates developing support tools for planning and conducting experiments, fostering openness for reproducibility, and strategically integrating AI and Machine Learning (ML). It also emphasizes the importance of formal experimental designs, adherence to standards, quality assurance, and addressing emerging challenges to advance the field.

  • Leveraging Conceptual Modeling for Experimental Rigor, Transparency, and Reproducibility in Software Engineering and Digital Forensics
    Edson OliveiraJr, Carlos D. Luz, Avelino F. Zorzo
    In Conceptual Models: Frameworks for Understanding Complex Systems and Interdisciplinary Inquiry. Nova Science Publishers, 2026
    DOI:
    Abstract: This chapter presents a comprehensive and integrative account of how conceptual modeling can strengthen rigor, transparency, and reproducibility in empirical research across Software Engineering and Digital Forensics. It examines the epistemological foundations of conceptual modeling, its cognitive and methodological roles, and its evolution into semantic and ontology-driven representations that support both human understanding and machine-actionable knowledge. The chapter introduces detailed conceptual models for controlled experimentation in Software Engineering and Digital Forensics, highlighting their phases, entities, relationships, and hierarchical structures, and their alignment with established empirical guidelines and open science principles. It further demonstrates how these models facilitate experimental planning, execution, analysis, interpretation, dissemination, and educational use while enabling interoperability, provenance tracking, and forensic readiness. By extending these contributions through a set of prospective actions, the chapter positions conceptual modeling as a unifying scientific instrument that can drive cumulative, reproducible, and cross-domain empirical research in computing disciplines.