Development

From Things and Stuff Wiki


General

See IDE, Editor, Debugging, Computing, Computer, Programming, Organisation





https://news.ycombinator.com/item?id=26048784






  • https://en.wikipedia.org/wiki/Functional_specification - (also, functional spec, specs, functional specifications document (FSD), functional requirements specification, or Program specification) in systems engineering and software development is the documentation that describes the requested behavior of an engineering system. The documentation typically describes what is needed by the system user as well as requested properties of inputs and outputs (e.g. of the software system). A functional specification is the more technical response to a matching requirements document, e.g. the Product Requirement Document "PRD". Thus it picks up the results of the requirements analysis stage. On more complex systems, multiple levels of functional specifications will typically nest within each other, e.g. on the system level, on the module level and on the level of technical details.




People




Learning


  • CodeTriage - You want to contribute to Open Source, great! But how do you get started? CodeTriage helps by picking a handful of open issues and delivering them directly to your inbox. After you sign up for CodeTriage, you pick the repos you want to help with, and we periodically send you issues. If you get busy we have an algorithm that helps to back off the issue load so that you don't get overwhelmed.


Tools

See also IDE, Typography#Monospace, WebDev#Authoring, Vim, Git, JS tools, *nix, etc.



to sort






  • Formal Methods of Software Design - using mathematics to write error-free programs. The mathematics needed is not complicated; it's just boolean algebra. The word "formal" means the use of a formal language, so that the program logic can be machine checked. Our compilers already tell us if we make a syntax error, or a type error, and they tell us what and where the error is. Formal methods take the next step, telling us if we make a logic error, and they tell us what and where the error is. And they tell us this as we make the error, not after the program is finished. It is good to get any program correct while writing it, rather than waiting for bug reports from users. It is absolutely essential for programs that lives will depend on.



The Joel Test (Joel Spolsky):

  1. Do you use source control?
  2. Can you make a build in one step?
  3. Do you make daily builds?
  4. Do you have a bug database?
  5. Do you fix bugs before writing new code?
  6. Do you have an up-to-date schedule?
  7. Do you have a spec?
  8. Do programmers have quiet working conditions?
  9. Do you use the best tools money can buy?
  10. Do you have testers?
  11. Do new candidates write code during their interview?
  12. Do you do hallway usability testing?

tef:

  1. "Programmers who know they will make mistakes
  2. "Programmers who think they will not make mistakes"


  • Methods & Tools - Software development magazine: software testing, project management, Agile, Scrum, UML, programming, requirements



  • https://github.com/tree-sitter/tree-sitter - a parser generator tool and an incremental parsing library. It can build a concrete syntax tree for a source file and efficiently update the syntax tree as the source file is edited. Tree-sitter aims to be: General enough to parse any programming language; Fast enough to parse on every keystroke in a text editor; Robust enough to provide useful results even in the presence of syntax errors; Dependency-free so that the runtime library (which is written in pure C) can be embedded in any application.
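
A minimal sketch using the Python bindings (py-tree-sitter); this follows the older pre-0.22 API and hypothetical paths, so adjust for the installed version:

    from tree_sitter import Language, Parser

    # Build a shared library from a grammar checkout (paths are hypothetical),
    # then parse a snippet and print its concrete syntax tree.
    Language.build_library("build/langs.so", ["vendor/tree-sitter-python"])
    parser = Parser()
    parser.set_language(Language("build/langs.so", "python"))

    tree = parser.parse(b"def add(a, b):\n    return a + b\n")
    print(tree.root_node.sexp())

    # Edited sources can be re-parsed incrementally by passing the old tree:
    # new_tree = parser.parse(new_source_bytes, tree)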

Code beautifier

  • https://en.wikipedia.org/wiki/Prettyprint - or pretty-print is the application of any of various stylistic formatting conventions to text files, such as source code, markup, and similar kinds of content. These formatting conventions can adjust positioning and spacing (indent style), add color and contrast (syntax highlighting), adjust size, and make similar modifications intended to make the content easier for people to view, read, and understand. Prettyprinters for programming language source code are sometimes called code beautifiers.


Autocompletion

  • TabNine - the all-language autocompleter. It uses machine learning to provide responsive, reliable, and relevant suggestions. Traditional autocompleters suggest one word at a time. Why accept this limitation?

Tags

CTags

  • https://en.wikipedia.org/wiki/Ctags - generates an index (or tag) file of language objects found in source files that allows these items to be quickly and easily located by a text editor or other utility. A tag signifies a language object for which an index entry is available (or, alternatively, the index entry created for that object).



GNU GLOBAL

  • GNU GLOBAL - source code tagging system that works the same way across diverse environments, such as Emacs editor, Vi editor, Less viewer, Bash shell, various web browsers, etc. You can locate various objects, such as functions, macros, structs, classes, in your source files and move there easily. It is useful for hacking on large projects which contain many sub-directories, many #ifdef and many main() functions. It is similar to ctags or etags, but is different from them in the following two points: independence of any editor; capability to treat definition and reference. It runs on UNIX (POSIX) compatible operating systems, like GNU and BSD.


Language server

  • Langserver.org - a community-driven site, maintained by Sourcegraph, to track development progress of LSP-compatible language servers and clients.


  • https://en.wikipedia.org/wiki/Language_Server_Protocol - an open, JSON-RPC-based protocol for use between source code editors or integrated development environments (IDEs) and servers that provide programming language-specific features. The goal of the protocol is to allow programming language support to be implemented and distributed independently of any given editor or IDE.
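
A minimal sketch of the wire format: a JSON-RPC 2.0 initialize request framed with a Content-Length header, sent to a server speaking LSP over stdio. The server command and flag are placeholders, not a real binary:

    import json
    import subprocess

    # Placeholder command: any language server that speaks LSP over stdio.
    server = subprocess.Popen(
        ["my-language-server", "--stdio"],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
    )

    def send(message: dict) -> None:
        body = json.dumps(message).encode("utf-8")
        header = f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
        server.stdin.write(header + body)
        server.stdin.flush()

    # The first request in every session is "initialize"; the reply (framed
    # the same way) advertises the server's capabilities.
    send({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {"processId": None, "rootUri": None, "capabilities": {}},
    })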


Static analysis

See also Debugging


  • https://en.wikipedia.org/wiki/Static_program_analysis - the analysis of computer software that is performed without actually executing programs, in contrast with dynamic analysis, which is analysis performed on programs while they are executing. In most cases the analysis is performed on some version of the source code, and in the other cases, some form of the object code. The term is usually applied to the analysis performed by an automated tool, with human analysis being called program understanding, program comprehension, or code review. Software inspections and software walkthroughs are also used in the latter case.




  • Static Program Analysis - Static program analysis is the art of reasoning about the behavior of computer programs without actually running them. This is useful not only in optimizing compilers for producing efficient code but also for automatic error detection and other tools that can help programmers. As known from Turing and Rice, all interesting properties of the behavior of programs written in common programming languages are mathematically undecidable. This means that automated reasoning of software generally must involve approximation. It is also well known that testing may reveal errors but not show their absence. In contrast, static program analysis can - with the right kind of approximations - check all possible executions of the programs and provide guarantees about their properties. The challenge when developing such analyses is how to ensure high precision and efficiency to be practically useful. This teaching material concisely presents the essential principles and algorithms for static program analysis. We emphasize a constraint-based approach where suitable constraint systems conceptually divide analysis into a front-end that generates constraints from program code and a back-end that solves the constraints to produce the analysis results. The style of presentation is intended to be precise but not overly formal. The readers are assumed to be familiar with advanced programming language concepts and the basics of compiler construction. The concepts are explained using a tiny imperative programming language, TIP, which suffices to illustrate the main challenges that arise with mainstream languages. The lecture notes, slides, implementation, and exercises have been developed since 2008 for our graduate-level course at Aarhus University. We continue to update the material regularly. Suggestions for improvements are welcome!




  • SARIF Home - an industry standard format for the output of static analysis tools.


  • https://github.com/microsoft/sarif-tutorials - the Static Analysis Results Interchange Format, defines a standard format for the output of static analysis tools. It is a powerful and sophisticated format suited to the needs of a wide variety of tools. For this reason — and because the format is defined in a 220-plus page specification written in formal language! — it can be hard to learn SARIF and to figure out what parts of it you need to use. These tutorials aim to present SARIF in a more approachable way. We'll start with some background: Why do we need SARIF? Where did it come from? What can it do? Then we'll dive into the format, exploring the most basic concepts first, then moving on to more advanced concepts.
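
A minimal sketch of a SARIF log built in Python; the field names follow the SARIF 2.1.0 specification, but the tool name, rule id and location are invented for illustration:

    import json

    # One run of one (fictional) tool reporting a single result.
    log = {
        "version": "2.1.0",
        "runs": [{
            "tool": {"driver": {"name": "example-checker",
                                "rules": [{"id": "EX0001"}]}},
            "results": [{
                "ruleId": "EX0001",
                "level": "warning",
                "message": {"text": "Possible null dereference."},
                "locations": [{
                    "physicalLocation": {
                        "artifactLocation": {"uri": "src/main.c"},
                        "region": {"startLine": 42},
                    }
                }],
            }],
        }],
    }
    print(json.dumps(log, indent=2))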



  • https://en.wikipedia.org/wiki/Shape_analysis_(program_analysis) - a static code analysis technique that discovers and verifies properties of linked, dynamically allocated data structures in (usually imperative) computer programs. It is typically used at compile time to find software bugs or to verify high-level correctness properties of programs. In Java programs, it can be used to ensure that a sort method correctly sorts a list. For C programs, it might look for places where a block of memory is not properly freed.




  • Cppcheck - a static analysis tool for C/C++ code. It provides unique code analysis to detect bugs and focuses on detecting undefined behaviour and dangerous coding constructs. The goal is to have very few false positives. Cppcheck is designed to be able to analyze your C/C++ code even if it has non-standard syntax (common in embedded projects).
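
A hedged sketch of driving Cppcheck from a script and reading its XML report; the flags are from Cppcheck's documented command line and the report is written to stderr, but verify both against the installed version:

    import subprocess
    import xml.etree.ElementTree as ET

    # Run cppcheck over a source tree (path is illustrative) and collect the
    # XML report it writes to stderr.
    proc = subprocess.run(
        ["cppcheck", "--enable=warning,style", "--xml", "src/"],
        capture_output=True, text=True,
    )
    report = ET.fromstring(proc.stderr)
    for error in report.iter("error"):
        print(error.get("id"), error.get("severity"), error.get("msg"))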

Methodologies

See also UI, Organisation


  • https://en.wikipedia.org/wiki/Application_lifecycle_management - the product lifecycle management (governance, development, and maintenance) of computer programs. It encompasses requirements management, software architecture, computer programming, software testing, software maintenance, change management, continuous integration, project management, and release management



FLOSS

See also Free/open


  • https://en.wikipedia.org/wiki/Open-source_software_development - the process by which open-source software, or similar software whose source code is publicly available, is developed. These are software products available with their source code under an open-source license to study, change, and improve its design. Examples of some popular open-source software products are Mozilla Firefox, Google Chromium, Android, LibreOffice and the VLC media player. Open-source software development has been a large part of the creation of the World Wide Web as we know it, with Tim Berners-Lee contributing his HTML code development as the original platform upon which the internet is now built.


  • https://en.wikipedia.org/wiki/Open-source_model - a decentralized software-development model that encourages open collaboration. A main principle of open-source software development is peer production, with products such as source code, blueprints, and documentation freely available to the public. The open-source movement in software began as a response to the limitations of proprietary code. The model is used for projects such as in open-source appropriate technology, and open-source drug discovery.

Open source promotes universal access via an open-source or free license to a product's design or blueprint, and universal redistribution of that design or blueprint. Before the phrase open source became widely adopted, developers and producers used a variety of other terms. Open source gained hold with the rise of the Internet. The open-source software movement arose to clarify copyright, licensing, domain, and consumer issues.

Generally, open source refers to a computer program in which the source code is available to the general public for use or modification from its original design. Open-source code is meant to be a collaborative effort, where programmers improve upon the source code and share the changes within the community. Code is released under the terms of a software license. Depending on the license terms, others may then download, modify, and publish their version (fork) back to the community.





Stages of growth

  • https://en.wikipedia.org/wiki/Stages_of_growth_model - a theoretical model for the growth of information technology (IT) in a business or similar organization. It was developed by Richard L. Nolan during the 1970s, and described by him in the Harvard Business Review.
  • Initiation
  • Contagion
  • Control
  • Integration
  • Data administration
  • Maturity

QMMG

  • https://en.wikipedia.org/wiki/Quality_Management_Maturity_Grid - an organizational maturity matrix conceived by Philip B. Crosby first published in his book Quality is Free in 1979. The QMMG is used by a business or organization as a benchmark of how mature their processes are, and how well they are embedded in their culture, with respect to service or product quality management. The QMMG is credited with being the precursor maturity model for the Capability Maturity Model (CMM) created a decade later that also has five levels of maturity.

The Quality Management Maturity Grid describes 5 maturity levels through which an organization or business will go through:

  • Uncertainty
  • Awakening
  • Enlightenment
  • Wisdom
  • Certainty


OODA loop

  • https://en.wikipedia.org/wiki/OODA_loop - refers to the decision cycle of observe, orient, decide, and act, developed by military strategist and USAF Colonel John Boyd in the mid-1970s (c. 1976). Boyd applied the concept to the combat operations process, often at the strategic level in military operations. It is now also often applied to understand commercial operations and learning processes.



CMM




  • https://en.wikipedia.org/wiki/CMMI_Version_1.3 - released in 2010; supports agile methods.


V-Model


Extreme programming (XP)



  • https://en.wikipedia.org/wiki/Sprint_(software_development) - Sprints are organized around the ideas of the Extreme Programming discipline of software development. The sprint is directed by the coach, who suggests tasks, tracks their progress and makes sure that no one is stuck. Most of the development happens in pairs. A large open space is often chosen as a venue for efficient communication. Sprints can vary in focus. During some sprints people new to the project are welcomed and get an intensive hands-on introduction pairing with an experienced project member. The first part of such sprints is usually spent getting ready, presenting the tutorials, getting the network setup and ensuring that configuration/source control software and processes are installed and followed. A significant benefit of sprinting is that the project members meet in person, socialize, and start to communicate more effectively than when working together remotely.


UP / RUP / AUP

  • https://en.wikipedia.org/wiki/Unified_Process - or Unified Software Development Process, is a popular iterative and incremental software development process framework. The best-known and extensively documented refinement of the Unified Process is the Rational Unified Process (RUP). Other examples are OpenUP and Agile Unified Process.


  • https://en.wikipedia.org/wiki/Rational_Unified_Process - an iterative software development process framework created by the Rational Software Corporation, a division of IBM since 2003. RUP is not a single concrete prescriptive process, but rather an adaptable process framework, intended to be tailored by the development organizations and software project teams that will select the elements of the process that are appropriate for their needs. RUP is a specific implementation of the Unified Process.


Agile

"We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:

  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more."






Services:


Scrum



  • Taiga.io - a project management platform for agile developers & designers and project managers who want a beautiful tool that makes work truly enjoyable.


DSDM


DevOps


TDD

See Testing



BDD

  • Behavior Driven Development - BDD evolved by incorporating what works from Test Driven Development, Agile User Stories, Domain Driven Design and XP with an emphasis on product behavior testing over unit testing. Project stakeholders and team members focus on the problem domain and develop a common language for expressing a product's desired behavior as stories and acceptance test criteria. Developers can then map the stories and criteria on their test code to verify application behavior and report results in the same common language.
  • easyb - a behavior driven development framework for the Java platform. By using a specification-based Domain Specific Language, easyb aims to enable executable, yet readable documentation.
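
As an illustration of the BDD style outside the JVM, a minimal sketch using Python's behave library (an assumption here; easyb plays a similar role for Java). A story's Given/When/Then steps map onto step-definition functions that share state through a context object:

    # features/steps/cart_steps.py -- step definitions for a story such as:
    #   Scenario: adding an item to an empty cart
    #     Given an empty cart
    #     When I add a book priced 10
    #     Then the cart total is 10
    from behave import given, when, then

    @given("an empty cart")
    def step_empty_cart(context):
        context.cart = []

    @when("I add a book priced {price:d}")
    def step_add_book(context, price):
        context.cart.append(price)

    @then("the cart total is {total:d}")
    def step_check_total(context, total):
        assert sum(context.cart) == total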

DDD


  • https://github.com/wwerner/event-storming-cheatsheet - a rapid design technique for software systems involving both technical staff and domain experts / business analysts. It best fits in a Domain Driven Design context and leans towards / prepares for Event Sourcing and CQRS. The technique was first introduced by Alberto Brandolini and picked up by Vaughn Vernon in Domain Driven Design Distilled. It is also taught as part of his iDDD Workshop series.


ACC

  • Test Analytics - a Google web application that allows rapid generation of a project's ACC model -- an alternative to a test plan that is faster to create and of more practical value. This decomposition of the product allows an easy way to visualize project risk across project capabilities. In addition, Test Analytics supports importing quality signals -- tests, code changes, and bugs -- to quantify risk and map it across your project's model. This gives a bird's eye view of the risk associated with all areas of your project, and a way to assess what portions of your project need additional testing.

ACC consists of three different parts that define your system under test: Attributes, Components, and Capabilities. An easy way to think of each of these elements is by relating them to a part of speech relating to your project.

  • Attributes (adjectives of the system) are qualities and characteristics that promote the product and distinguish it from the competition; examples are "Fast", "Secure", "Stable", and "Elegant". A product manager could have a hand in narrowing down the list of Attributes for the system.
  • Components (nouns of the system) are building blocks that together constitute the system in question. Some examples of Components are "Firmware", "Printing", and "File System" for an operating system project, or "Database", "Cart", and "Product Browser" for an online shopping site.
  • Capabilities (verbs of the system) describe the abilities of a particular Component in order to satisfy the Attributes of the system. An example Capability for a shopping site could be "Processes monetary transactions using HTTPS". You can see that this could be a Capability of the "Cart" component when trying to meet the "Secure" Attribute. The most important aspect of Capabilities is that they are testable.
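
A minimal sketch of an ACC breakdown as plain data, reusing the shopping-site examples above (the extra entries are invented): rows pair a Component with an Attribute, and each cell lists the testable Capabilities that tie them together.

    # Component x Attribute -> list of testable Capabilities.
    acc = {
        ("Cart", "Secure"): ["Processes monetary transactions using HTTPS"],
        ("Cart", "Fast"): ["Updates totals without a full page reload"],
        ("Product Browser", "Stable"): ["Paginates large result sets without error"],
    }

    # A bird's-eye view of coverage: cells with few capabilities listed may be
    # the least-tested parts of the model.
    for (component, attribute), capabilities in sorted(acc.items()):
        print(f"{component} x {attribute}: {len(capabilities)} capability(ies)")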

Further


  • https://en.wikipedia.org/wiki/Acceptance_test-driven_development - a development methodology based on communication between the business customers, the developers, and the testers. ATDD encompasses many of the same practices as Specification by Example, Behavior Driven Development (BDD), Example-Driven Development (EDD), and Story Test-Driven Development (SDD). All these processes aid developers and testers in understanding the customer’s needs prior to implementation and allow customers to be able to converse in their own domain language. ATDD is closely related to Test-Driven Development. It differs by the emphasis on developer-tester-business customer collaboration. ATDD encompasses acceptance testing, but highlights writing acceptance tests before developers begin coding.


to sort

See also UI

  • https://en.wikipedia.org/wiki/Activity-centered_design - an extension of the Human-centered design paradigm in interaction design. ACD features heavier emphasis on the activities that a user would perform with a given piece of technology. ACD has its theoretical underpinnings in activity theory, from which activities can be defined as actions taken by a user to achieve a goal.

When working with activity-centered design, the designers use research to gain insights into the users. Observations and interviews are typical approaches to learn more about the users' behavior. By mapping users' activities and tasks, the designer may notice missing tasks that would make the activity easier to perform, and thus design solutions to accomplish those tasks.



Creative process - what happens when one participant doesn't want to use the same creative-process-assisting tool workflow, i.e., does not want to use a long-term backlog (icebox) for ideas, seeing it as stress inducing? Hiding the backlog is one half-answer, but then they would still have to view the backlog to check that their new idea hasn't already been started on by others.



  • https://en.wikipedia.org/wiki/Interface_segregation_principle - states that no code should be forced to depend on methods it does not use. ISP splits interfaces that are very large into smaller and more specific ones so that clients will only have to know about the methods that are of interest to them. Such shrunken interfaces are also called role interfaces. ISP is intended to keep a system decoupled and thus easier to refactor, change, and redeploy. ISP is one of the five SOLID principles of object-oriented design, similar to the High Cohesion Principle of GRASP. Beyond object-oriented design, ISP is also a key principle in the design of distributed systems in general and microservices in particular. ISP is one of the six IDEALS principles for microservice design.
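
A minimal Python sketch of the principle, using typing.Protocol to express small role interfaces; the names are illustrative only:

    from typing import Protocol

    # Role interfaces: clients depend only on the methods they actually use,
    # instead of one wide "multi-function device" interface.
    class Printer(Protocol):
        def print_document(self, doc: str) -> None: ...

    class Scanner(Protocol):
        def scan_document(self) -> str: ...

    class SimplePrinter:
        def print_document(self, doc: str) -> None:
            print(f"printing: {doc}")

    def print_report(device: Printer, report: str) -> None:
        # Only needs the Printer role; a device with no scanner still qualifies.
        device.print_document(report)

    print_report(SimplePrinter(), "quarterly totals")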

Modelling

See also Dataflow

to unconflate

  • https://en.wikipedia.org/wiki/Structure_chart - in software engineering and organizational theory is a chart which shows the breakdown of a system to its lowest manageable levels. They are used in structured programming to arrange program modules into a tree. Each module is represented by a box, which contains the module's name. The tree structure visualizes the relationships between modules.


  • https://en.wikipedia.org/wiki/Function_model - or functional model is a structured representation of the functions (activities, actions, processes, operations) within the modeled system or subject area. A function model, similar to the activity model or process model, is a graphical representation of an enterprise's function within a defined scope. The purposes of the function model are to describe the functions and processes, assist with discovery of information needs, help identify opportunities, and establish a basis for determining product and service costs.


  • https://en.wikipedia.org/wiki/Modeling_language - any artificial language that can be used to express information or knowledge or systems in a structure that is defined by a consistent set of rules. The rules are used for interpretation of the meaning of components in the structure.


  • https://en.wikipedia.org/wiki/Flow_diagram - a collective term for a diagram representing a flow or set of dynamic relationships in a system. The term flow diagram is also used as a synonym for flowchart, and sometimes as a counterpart of the flowchart. Flow diagrams are used to structure and order a complex system, or to reveal the underlying structure of the elements and their interaction.


  • https://en.wikipedia.org/wiki/Data-flow_diagram - a way of representing a flow of a data of a process or a system (usually an information system). The DFD also provides information about the outputs and inputs of each entity and the process itself. A data-flow diagram has no control flow, there are no decision rules and no loops. Specific operations based on the data can be represented by a flowchart.


  • https://en.wikipedia.org/wiki/Control-flow_diagram - a diagram to describe the control flow of a business process, process or review. Control-flow diagrams were developed in the 1950s, and are widely used in multiple engineering disciplines. They are one of the classic business process modeling methodologies, along with flow charts, drakon-charts, data flow diagrams, functional flow block diagram, Gantt charts, PERT diagrams, and IDEF.


  • https://en.wikipedia.org/wiki/Control-flow_graph - a control-flow graph (CFG) is a representation, using graph notation, of all paths that might be traversed through a program during its execution. The control-flow graph is due to Frances E. Allen, who notes that Reese T. Prosser used boolean connectivity matrices for flow analysis before. The CFG is essential to many compiler optimizations and static-analysis tools.
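
A CFG is ultimately just a graph over basic blocks; a minimal Python sketch with a toy adjacency mapping and the kind of reachability walk used to flag unreachable code (block names are invented):

    # Basic-block label -> successor labels.
    cfg = {
        "entry": ["cond"],
        "cond": ["then", "else"],
        "then": ["exit"],
        "else": ["exit"],
        "exit": [],
        "dead": ["exit"],          # never reached from entry
    }

    def reachable(graph, start):
        seen, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node not in seen:
                seen.add(node)
                stack.extend(graph[node])
        return seen

    print(sorted(set(cfg) - reachable(cfg, "entry")))  # ['dead'] -> unreachable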


  • https://en.wikipedia.org/wiki/Object-oriented_modeling - an approach to modeling an application that is used at the beginning of the software life cycle when using an object-oriented approach to software development. The software life cycle is typically divided up into stages going from abstract descriptions of the problem to designs then to code and testing and finally to deployment. Modeling is done at the beginning of the process.
  • https://en.wikipedia.org/wiki/Object-oriented_analysis_and_design - a popular technical approach for analyzing and designing an application, system, or business by applying object-oriented programming, as well as using visual modeling throughout the development life cycles to foster better stakeholder communication and product quality. According to the popular guide Unified Process, OOAD in modern software engineering is best conducted in an iterative and incremental way. Iteration by iteration, the outputs of OOAD activities, analysis models for OOA and design models for OOD respectively, will be refined and evolve continuously driven by key factors like risks and business value.

In 1994, the Three Amigos of Rational Software started working together to develop the Unified Modeling Language (UML). Later, together with Philippe Kruchten and Walker Royce (eldest son of Winston Royce), they have led a successful mission to merge their own methodologies, OMT, OOSE and Booch method, with various insights and experiences from other industry leaders into the Rational Unified Process (RUP), a comprehensive iterative and incremental process guide and framework for learning industry best practices of software development and project management. Since then, the Unified Process family has become probably the most popular methodology and reference model for object-oriented analysis and design.


  • https://en.wikipedia.org/wiki/User_interface_modeling - a development technique used by computer application programmers. Today's user interfaces (UIs) are complex software components, which play an essential role in the usability of an application. The development of UIs therefore requires not only guidelines and best-practice reports, but also a development process including the elaboration of visual models and a standardized notation for this visualization.


USL

  • https://en.wikipedia.org/wiki/Universal_Systems_Language - a modeling language and formal method for the specification and design of software and other complex systems. It was designed by Margaret Hamilton based on her experiences writing flight software for the Apollo program. The language is implemented through the 001 Tool Suite software by Hamilton Technologies, Inc. USL evolved from 001AXES which in turn evolved from AXES all of which are based on Hamilton's axioms of control. The 001 Tool Suite uses the preventative concept of Development Before the Fact (DBTF) for its life-cycle development process. DBTF eliminates errors as early as possible during the development process removing the need to look for errors after-the-fact.


UML

  • https://en.wikipedia.org/wiki/Unified_Modeling_Language - a general-purpose, developmental, modeling language in the field of software engineering, that is intended to provide a standard way to visualize the design of a system. The creation of UML was originally motivated by the desire to standardize the disparate notational systems and approaches to software design. It was developed by Grady Booch, Ivar Jacobson and James Rumbaugh at Rational Software in 1994–1995, with further development led by them through 1996. In 1997 UML was adopted as a standard by the Object Management Group (OMG), and has been managed by this organization ever since. In 2005 UML was also published by the International Organization for Standardization (ISO) as an approved ISO standard. Since then the standard has been periodically revised to cover the latest revision of UML.


  • https://en.wikipedia.org/wiki/Activity_diagram - graphical representations of workflows of stepwise activities and actions with support for choice, iteration and concurrency. In the Unified Modeling Language, activity diagrams are intended to model both computational and organizational processes (i.e., workflows), as well as the data flows intersecting with the related activities. Although activity diagrams primarily show the overall flow of control, they can also include elements showing the flow of data between activities through one or more data stores.





UMLet / UMLetino

  • UMLet - a free, open-source UML tool with a simple user interface: draw UML diagrams fast, build sequence and activity diagrams from plain text, export diagrams to eps, pdf, jpg, svg, and clipboard, share diagrams using Eclipse, and create new, custom UML elements. UMLet runs stand-alone or as Eclipse plug-in on Windows, OS X and Linux.
  • UMLetino - a free online UML tool for fast UML diagrams. It runs in your browser, and does not require any installs. It is based on UMLet (which is available as stand-alone tool or Eclipse plugin), and shares its fast, text-based way of drawing UML sketches. Main features: install-free web app; save diagrams in browser storage; support for many UML diagram types; simple, markup-based UML element modifications; png export.


PlantUML

  • PlantUML - Open-source tool that uses simple textual descriptions to draw beautiful UML diagrams.
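
For a feel of the input format, a small sequence diagram in PlantUML's textual notation (a minimal sketch; the participants and messages are made up):

    @startuml
    actor User
    participant "Web App" as App
    database DB

    User -> App : submit form
    App -> DB : INSERT row
    DB --> App : ok
    App --> User : confirmation
    @enduml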

BOUML

  • BOUML - a free UML 2 tool box including a modeler allowing you to specify and generate code in C++, Java, Idl, Php, Python and MySQL. Since release 7.0, BOUML is again free software. BOUML runs under Windows, Linux and MacOS X. BOUML is very fast and doesn't require much memory to manage several thousands of classes, see benchmark. BOUML is extensible, and the external tools named plug-outs can be written in C++ or Java, using BOUML for their definition as any other program. The code generators, reverses and roundtrips are among the pre-defined plug-outs included in the BOUML distribution.


Umbrello

  • Umbrello - a Unified Modelling Language (UML) diagram program based on KDE Technology. UML allows you to create diagrams of software and other systems in a standard format to document or design the structure of your programs.


ArgoUML

  • ArgoUML - the leading open source UML modeling tool and includes support for all standard UML 1.4 diagrams. It runs on any Java platform and is available in ten languages.


Violet

  • Violet - a UML editor with these benefits: Very easy to learn and use. Draws nice-looking diagrams. Completely free. Cross-platform. Violet is intended for developers, students, teachers, and authors who need to produce simple UML diagrams quickly


gaphor


Clang-uml

IDEF

  • https://en.wikipedia.org/wiki/IDEF - initially abbreviation of ICAM Definition, renamed in 1999 as Integration DEFinition, refers to a family of modeling languages in the field of systems and software engineering. They cover a wide range of uses, from functional modeling to data, simulation, object-oriented analysis/design and knowledge acquisition. These "definition languages" were developed under funding from U.S. Air Force and although still most commonly used by them, as well as other military and United States Department of Defense (DoD) agencies, are in the public domain. The most-widely recognized and used components of the IDEF family are IDEF0, a functional modeling language building on SADT, and IDEF1X, which addresses information models and database design issues.

Jackson structured programming

  • https://en.wikipedia.org/wiki/Jackson_structured_programming - a method for structured programming based on correspondences between data stream structure and program structure. JSP structures programs and data in terms of sequences, iterations and selections, and as a consequence it is applied when designing a program's detailed control structure. The method applies to processing of any data structure or data stream that is describable as a hierarchical structure of sequential, optional and iterated elements. This could be a stream of messages that a process reads to invoke and coordinate other modules or objects, or it could be a string of characters in parameters passed to a single operation on an "object" coded in an object-oriented programming language. In other words, it could be either above or below the level where object-oriented methods are applied.




Statecharts



Misc. software

UML, BPMN, MDA, SysML, ...


Web



Documentation

  • https://en.wikipedia.org/wiki/Software_documentation - written text or illustration that accompanies computer software or is embedded in the source code. The documentation either explains how the software operates or how to use it, and may mean different things to people in different roles

Documentation is an important part of software engineering. Types of documentation include:

  • Requirements – Statements that identify attributes, capabilities, characteristics, or qualities of a system. This is the foundation for what will be or has been implemented.
  • Architecture/Design – Overview of software. Includes relations to an environment and construction principles to be used in design of software components.
  • Technical – Documentation of code, algorithms, interfaces, and APIs.
  • End user – Manuals for the end-user, system administrators and support staff.
  • Marketing – How to market the product and analysis of the market demand.



  • https://en.wikipedia.org/wiki/Documentation_generator - a programming tool that generates software documentation intended for programmers (API documentation), or end users (end-user guides), or both, from a set of source code files, and in some cases, binary files. Some generators, such as Javadoc, can use special comments to drive the generation. Doxygen is an example of a generator that can use all of these methods.
  • https://en.wikipedia.org/wiki/Comparison_of_documentation_generators - general and technical information for a number of documentation generators. Please see the individual products' articles for further information. Unless otherwise specified in footnotes, comparisons are based on the stable versions without any add-ons, extensions or external programs. Note that many of the generators listed are no longer maintained.


Doxygen

  • Doxygen - the de facto standard tool for generating documentation from annotated C++ sources, but it also supports other popular programming languages such as C, Objective-C, C#, PHP, Java, Python, IDL (Corba, Microsoft, and UNO/OpenOffice flavors), Fortran, and to some extent D. Doxygen also supports the hardware description language VHDL.



  • https://github.com/jitsuCM/doxygraph - a free system for reverse engineering UML class diagrams from source code, and presenting those diagrams as interactive web apps. It relies on Doxygen for parsing source code, so it supports all the languages that Doxygen supports: C, C++, C#, Objective C, Java, Python, PHP, Tcl, D, IDL, VHDL, and Fortran.


Texinfo

  • https://en.wikipedia.org/wiki/Texinfo - a typesetting syntax used for generating documentation in both on-line and printed form (creating filetypes such as dvi, html, pdf, etc., and its own hypertext format, info) with a single source file. It is implemented by a computer program released as free software of the same name, created and made available by the GNU Project from the Free Software Foundation. The main purpose of Texinfo is to provide a way to easily typeset software manuals. Similar to the LaTeX syntax, all the normal features of a book, such as chapters, sections, cross references, tables and indices are available for use in documents. Using the various output generators that are available for Texinfo, it is possible to keep several documentation types up-to-date (such as on-line documentation provided via a Web site, and printed documentation, as generated using the TeX typesetting system) using only a single source file.


  • Texinfo - GNU Documentation System - GNU Project - Free Software Foundation (FSF) - the official documentation format of the GNU project. It is used by many non-GNU projects as well. Texinfo uses a single source file to produce output in a number of formats, both online and printed (HTML, PDF, DVI, Info, DocBook, LaTeX, EPUB 3). This means that instead of writing different documents for online information and another for a printed manual, you need write only one document. The Texinfo system is well-integrated with GNU Emacs.
  • Top (GNU Texinfo 7.0.3) - a documentation system that can produce both online information and a printed manual from a single source using semantic markup.


  • https://en.wikipedia.org/wiki/Info_(Unix) - a software utility which forms a hypertextual, multipage documentation and help viewer working on a command-line interface. Info reads info files generated by the texinfo program and presents the documentation as a tree with simple commands to traverse the tree and to follow cross references. For instance, pressing the space bar scrolls down within the current tree node or goes to the next node in the current document if already at the bottom of the current node, allowing the contents of an info file to be read sequentially. Pressing the backspace key moves in the opposite direction.


  • Info Format Specification (GNU Texinfo 7.0.3) - This format definition was written some 25 years after the Info format was first devised. So in the event of conflicts between this definition and actual practice, practice wins. It also assumes some general knowledge of Texinfo; it is meant to be a guide for implementors rather than a rigid technical standard. We may refer back to other parts of this manual for examples and definitions, rather than redundantly spelling out every detail.



Translation

  • https://en.wikipedia.org/wiki/Internationalization_and_localization - are means of adapting computer software to different languages, regional differences and technical requirements of a target locale. Internationalization is the process of designing a software application so that it can be adapted to various languages and regions without engineering changes. Localization is the process of adapting internationalized software for a specific region or language by adding locale-specific components and translating text. Localization (which is potentially performed multiple times, for different locales) uses the infrastructure or flexibility provided by internationalization (which is ideally performed only once, or as an integral part of ongoing development).



gettext
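
A minimal sketch of the gettext workflow using Python's standard-library bindings: translatable strings are wrapped in a marker function and looked up at runtime in a compiled catalogue (e.g. locale/de/LC_MESSAGES/myapp.mo). The domain, directory and language below are placeholders.

    import gettext

    # fallback=True returns the original strings when no catalogue is found.
    t = gettext.translation("myapp", localedir="locale",
                            languages=["de"], fallback=True)
    _ = t.gettext

    print(_("Welcome"))   # prints the German translation if the catalogue has one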

Weblate

  • Weblate - a free web-based translation tool with tight version control integration. It features simple and clean user interface, propagation of translations across components, quality checks and automatic linking to source files.

Release and deployment









  • Software Package Data Exchange (SPDX)
    • https://en.wikipedia.org/wiki/Software_Package_Data_Exchange - file format used to document information on the software licenses under which a given piece of computer software is distributed. SPDX is authored by the SPDX Working Group, which represents more than twenty different organizations, under the auspices of the Linux Foundation. SPDX attempts to standardize the way in which organizations publish their metadata on software licenses and components in bills of material.




  • https://en.wikipedia.org/wiki/Build_automation - the act of scripting or automating a wide variety of tasks that software developers do in their day-to-day activities, including compiling computer source code into binary code, packaging binary code, running tests, deploying to production systems, and creating documentation and/or release notes.




Continuous process

  • https://en.wikipedia.org/wiki/Continuous_design - a software development practice of creating and modifying the design of a system as it is developed, rather than specifying the system completely before development starts (as in the waterfall model) or in bursts at the beginning of each iteration (as in the iterative model). Also called "evolutionary design" or "incremental design", continuous design was popularized by extreme programming. Continuous design also uses test driven development and refactoring.


  • https://en.wikipedia.org/wiki/Continuous_integration - the practice, in software engineering, of merging all developer working copies with a shared mainline several times a day. It was first named and proposed by Grady Booch in his method, although he did not advocate integrating several times a day. It was adopted as part of extreme programming (XP), which did advocate multiple integrations a day, perhaps as many as tens a day. The main aim of CI is to prevent integration problems, referred to as "integration hell" in early descriptions of XP. CI isn't universally accepted as an improvement over frequent integration, so it is important to distinguish between the two, as there is disagreement about the virtues of each.

"Continuous Integration is a software development practice where members of a team integrate their work frequently, usually each person integrates at least daily - leading to multiple integrations per day. Each integration is verified by an automated build (including test) to detect integration errors as quickly as possible. Many teams find that this approach leads to significantly reduced integration problems and allows a team to develop cohesive software more rapidly."


  • https://en.wikipedia.org/wiki/Continuous_delivery - a design practice used in software development to automate and improve the process of software delivery. Techniques such as automated testing, continuous integration and continuous deployment allow software to be developed to a high standard and easily packaged and deployed to test environments, resulting in the ability to rapidly, reliably and repeatedly push out enhancements and bug fixes to customers at low risk and with minimal manual overhead. The technique was one of the assumptions of extreme programming but at an enterprise level has developed into a discipline of its own, with job descriptions for roles such as "buildmaster" calling for CD skills as mandatory.



  • https://github.com/nbedos/citop - A UNIX program to monitor Continuous Integration pipelines from the command line. citop stands for Continuous Integration Table Of Pipelines.


Jenkins

StriderCD

  • Strider - an Open Source Continuous Deployment / Continuous Integration platform. It is written in Node.JS / JavaScript and uses MongoDB as a backing store. BSD license. A focus on Continuous Deployment rather than just Continuous Integration. Designed to be easy to install & setup. Deployable & usable on Heroku free plan. Intended for deployment on private infrastructure. An emphasis on extensibility. Plugins are powerful, easy to write and simple to install. Out-of-the-box support for projects written in Node.JS, Python (generic and Django/Pyramid) and Selenium/Sauce Labs tests. Commercial support, consulting & hosting available.

Git based

Travis CI

  • Travis CI - a hosted, distributed continuous integration service used to build and test projects hosted at GitHub. The software is also available as an open source download on GitHub, although its developers do not currently recommend it for on-premise use for closed projects.


Testing

See also Testing

  • Errbit - The open source, self-hosted error catcher


Quality





Documentation


  • docopt - a language for describing command-line interfaces. docopt helps you define the interface for your command-line app, and automatically generate a parser for it. docopt is based on conventions that have been used for decades in help messages and man pages for describing a program's interface. An interface description in docopt is such a help message, but formalized.
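
A minimal sketch of the idea in Python: the module docstring is the interface description, and docopt parses argv against it (the program and option names are illustrative):

    """Naval Fate.

    Usage:
      navalfate.py ship <name> move <x> <y> [--speed=<kn>]
      navalfate.py (-h | --help)

    Options:
      -h --help      Show this screen.
      --speed=<kn>   Speed in knots [default: 10].
    """
    from docopt import docopt

    if __name__ == "__main__":
        args = docopt(__doc__)   # parses argv against the usage pattern above
        print(args)              # e.g. {'<name>': 'Titanic', '--speed': '20', ...}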



  • StyleDocco - generates documentation and style guide documents from your stylesheets. Stylesheet comments will be parsed through Markdown and displayed in a generated HTML document. You can write HTML code prefixed with 4 spaces or between code fences (```) in your comments, and StyleDocco shows a preview with the styles applied, and displays the example HTML code. The previews are rendered in resizable iframes to make it easy to demonstrate responsive designs at different viewport sizes.



  • Semantic Linefeeds - Instead of fussing with the lines of each paragraph so that they all end near the right margin, they can add linefeeds anywhere that there is a break between ideas. The result can be spectacular.
  • Every line of code is always documented - As it turns out, this line (more specifically, the change which introduced this line) is heavily documented with information about why it was necessary, why the previous approach (referred to by a commit SHA) did not work, which browsers are affected, and a link for further reading.


  • Sphinx - makes it easy to create intelligent and beautiful documentation. Here are some of Sphinx's major features. Output formats: HTML (including Windows HTML Help), LaTeX (for printable PDF versions), ePub, Texinfo, manual pages, plain text. Extensive cross-references: semantic markup and automatic links for functions, classes, citations, glossary terms and similar pieces of information. Hierarchical structure: easy definition of a document tree, with automatic links to siblings, parents and children. Automatic indices: general index as well as language-specific module indices. Code handling: automatic highlighting using the Pygments highlighter. Extensions: automatic testing of code snippets, inclusion of docstrings from Python modules (API docs) via built-in extensions, and much more functionality via third-party extensions. Themes: modify the look and feel of outputs via creating themes, and re-use many third-party themes. Contributed extensions: dozens of extensions contributed by users; most of them installable from PyPI. Sphinx uses the reStructuredText markup language by default, and can read MyST markdown via third-party extensions. Both of these are powerful and straightforward to use, and have functionality for complex documentation and publishing workflows. They both build upon Docutils to parse and write documents.
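
As a small illustration of the reStructuredText field-list markup that Sphinx (via its autodoc extension) reads from docstrings, a sketch with an invented function:

    def moving_average(values, window):
        """Return the simple moving average of *values*.

        :param values: sequence of numbers to smooth
        :param window: number of trailing samples per average
        :returns: list of averages, one per full window
        :raises ValueError: if *window* is not positive
        """
        if window <= 0:
            raise ValueError("window must be positive")
        return [sum(values[i - window:i]) / window
                for i in range(window, len(values) + 1)]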


  • Read the Docs - hosts documentation for the open source community. We support Sphinx docs written with reStructuredText and CommonMark. We pull your code from your Subversion, Bazaar, Git, and Mercurial repositories. Then we build documentation and host it for you. Think of it as Continuous Documentation.



Collaborative