In the era of "Big Data", when data science was becoming a more common role in organizations, I was asked to answer a business question as the impetus for a new feature. Since I didn't have access to real production data, and we copied redacted and masked data from production down to our QA environment, I figured that was good enough for my research. I couldn't have been more wrong. Unsurprisingly, the data in QA was full of holes and inconsistencies, and in a few cases the table schemas didn't even match our development or production environments.
Naturally, this led me to ask if anyone knew what was going on, which resulted in, "well, this is just QA, who cares if our data is messy?" For some developers, that might be a totally valid take, but if your team relies on that data to correctly perform automated testing and ETL processing, then that's a scary prospect.
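One way a team could catch that kind of drift before it bites a pipeline is a small schema sanity check run against each environment. The sketch below is purely illustrative, using Python's built-in sqlite3 module and a made-up `customers` table to stand in for the real environments; the idea, not the specific names, is the point.

```python
import sqlite3

def table_columns(conn, table):
    """Return the ordered list of (name, type) pairs for a table's columns."""
    return [(row[1], row[2]) for row in conn.execute(f"PRAGMA table_info({table})")]

def schemas_match(conn_a, conn_b, table):
    """True only if both environments define the table with identical columns."""
    return table_columns(conn_a, table) == table_columns(conn_b, table)

# Hypothetical 'prod' and 'qa' environments, modeled here as in-memory databases.
prod = sqlite3.connect(":memory:")
qa = sqlite3.connect(":memory:")
prod.execute("CREATE TABLE customers (id INTEGER, name TEXT, email TEXT)")
qa.execute("CREATE TABLE customers (id INTEGER, name TEXT)")  # QA has drifted

print(schemas_match(prod, qa, "customers"))  # False: QA no longer matches prod
```

A check like this, wired into CI or run before an ETL job trusts an environment, turns "who cares if QA is messy?" into a failing build instead of a late surprise.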
When building an application, I look at it like building a house, especially if it's a greenfield project. To start, you have to have the land (your market); then you pour the foundation (data, architectural guidelines, etc.); next come the wiring and plumbing (the infrastructure and pipelines that keep the lights on); and finally the walls and pretty bits (the actual application code).
Just as I've seen developers over the years jump straight to coding when given a problem or task, many developers and even architects go straight to the infrastructure or implementation details without ever considering the data that will drive the whole thing. It might just be my personal experience, but when I've asked where the data is being sourced from, I often get answers along the lines of "I assumed it was there already," and I've been curious about how we got here.
Another problem I've seen is that, over time, the data we have becomes dirty or outright bad because of other issues within a system. I've rarely come across places where this data is cleaned up after an issue is discovered and resolved, and that leads to errant issues later in a system's lifecycle that could have been addressed earlier. Is this a case of new features being prioritized over fixing existing issues? I believe it may be.
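Even when the root cause gets fixed, the dirty rows it left behind stay invisible unless someone goes looking. As a minimal sketch of that kind of audit, the example below counts NULLs in columns that should always be populated; the `orders` table and its columns are hypothetical, again using Python's built-in sqlite3 module.

```python
import sqlite3

def audit_nulls(conn, table, columns):
    """Report how many rows have NULLs in columns that should always be populated."""
    report = {}
    for col in columns:
        (count,) = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL"
        ).fetchone()
        report[col] = count
    return report

# Hypothetical orders table with dirt left behind by a since-fixed bug.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, 10, 9.99), (2, None, 4.50), (3, 11, None)])

print(audit_nulls(conn, "orders", ["customer_id", "total"]))
# {'customer_id': 1, 'total': 1}
```

Running something like this on a schedule makes the leftover mess a visible number that someone can be asked about, rather than a latent bug waiting for a future feature to trip over.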
Anyhow, data is the foundation of a strong system, and I've seen it neglected or outright ignored because the assumption is that the foundation is strong. Just like any other part of a building, though, it needs to be maintained, and when maintenance is ignored, it slowly crumbles. As owners of software systems, we have to be aware of each part of our system and take occasional looks at even the more boring and unchanging areas to make sure they're good to go. Not doing so is neglectful and can have terrible consequences for your system and business.