• sysgen [none/use name,they/them] · edit-2 · 1 year ago

    Data is the most general concept we (can) have for anything that can be known or observed. If something can't be known or observed, it can't have any tangible effect, so you can never make a mechanism that operates on it. It's not a cop-out; it's the exact opposite.

      • sysgen [none/use name,they/them] · edit-2 · 1 year ago

        I think I see the disconnect here. When I say general, I also mean specific. That is to say, when I say that computers are general, I mean that any given, specific, individual computer (including a bird, if it were smart enough), handed any specific task and enough memory, can do it if and only if the task is possible at all.

        In other words, computers are interchangeable: you only need one, and it can do any task (with enough memory).

        This isn't true for a watch or a wrench or a wheel, but it's true for any computer.
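
        To make that concrete, here is a minimal sketch (my own illustration, nothing from the thread; the names run_tm and flip are mine) of the idea: one generic interpreter, fed different machine descriptions as plain data, can carry out whatever task those descriptions define, and the only resource it ever needs more of is memory.

        ```python
        # A tiny Turing-machine interpreter. The "rules" argument is just data;
        # swapping it out changes the task, not the machine running it.

        def run_tm(rules, tape, state="start", blank="_", max_steps=10_000):
            """rules: {(state, symbol): (new_state, write_symbol, move)}, move in {-1, 0, +1}."""
            cells = dict(enumerate(tape))      # sparse tape that grows as needed
            head = 0
            for _ in range(max_steps):
                if state == "halt":
                    break
                symbol = cells.get(head, blank)
                state, write, move = rules[(state, symbol)]
                cells[head] = write
                head += move
            return "".join(cells[i] for i in sorted(cells))

        # One example machine description (data, not a new computer): flip every bit.
        flip = {
            ("start", "0"): ("start", "1", +1),
            ("start", "1"): ("start", "0", +1),
            ("start", "_"): ("halt", "_", 0),
        }
        print(run_tm(flip, "01011"))           # -> 10100_
        ```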

        With this definition of general, as you can see, there is nothing vague about the statement at all, and it can't be read as a cop-out.

        Sorry for being unclear; I definitely understand why your interpretation of "general" in my comment is valid.

          • sysgen [none/use name,they/them] · 1 year ago

            There are things that computers are inherently incapable of. The problem is that, as far as anyone knows, those are things that are simply impossible, as long as we are talking about transforming data. The idea that there are possible things in this wheelhouse that computers can't do is called hypercomputation:

            https://en.wikipedia.org/wiki/Hypercomputation

            The only contender that isn't either mathematically or physically impossible is to take a regular computer and sling it around a black hole with particular attributes, i.e., it still relies on a perfectly normal computer.

            I mention this in the comment you're replying to - I say that a computer can do such a task if and only if it is possible.
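
            For what it's worth, the textbook example of a task in this category is the halting problem. Here is a rough sketch of the standard argument for why it's impossible on any computer (the names halts and diagonal are mine, purely for illustration):

            ```python
            # Suppose an oracle halts(program, input) existed that always correctly
            # answered "does this program halt on this input?". Then the program
            # below halts if and only if it doesn't halt, a contradiction; so no
            # such oracle can be written, on any computer.

            def halts(program, input_data):
                raise NotImplementedError("assumed oracle; provably cannot exist")

            def diagonal(program):
                if halts(program, program):    # if the oracle says "it halts"...
                    while True:                #   ...loop forever instead
                        pass
                return "halted"                # otherwise halt immediately

            # Asking halts(diagonal, diagonal) yields the contradiction: the answer
            # can be neither True nor False.
            ```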

            > The brain isn’t a computer. It doesn’t function off of binary logic as far as we know.

            Physical systems that rely on continuous processes (i.e., neither fully on nor off) are, as far as computation is concerned, exactly equivalent to discrete processes (a finite number of states), which are in turn exactly equivalent to binary processes (exactly two states).

            This is because continuous physical systems inherently, necessarily carry some amount of noise/uncertainty, and there is a rigorous mathematical proof that once that is the case, the two are equivalent.
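
            As a back-of-the-envelope illustration of the underlying idea (my numbers, not part of the proof; the name channel_capacity is mine): once a signal is noisy, it can only carry a finite number of bits, which is what the Shannon-Hartley theorem quantifies.

            ```python
            import math

            # Shannon-Hartley: a noisy analog channel carries at most
            # C = B * log2(1 + S/N) bits per second, no matter how finely
            # you try to slice the continuous signal.

            def channel_capacity(bandwidth_hz, signal_power, noise_power):
                return bandwidth_hz * math.log2(1 + signal_power / noise_power)

            # Hypothetical numbers: 1 kHz of usable bandwidth, 20 dB signal-to-noise.
            print(channel_capacity(1_000, 100, 1))   # ~6658 bits/s: finite, hence discretizable
            ```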

            It is exceedingly unlikely that there is any operation on data that the human brain can do but that computers can't. There is not an inkling of a scientific hint that this could possibly be true. Of course, you can't prove a negative.

            > There are a lot of assumptions here you’re making, and that the majority of tech bros incorrectly make

            The thing I am assuming here is the Church-Turing thesis. It's a thesis that is assumed to be true by every computer scientist and the vast majority of the mathematicians whom it concerns. If you can show that it's incorrect, I'd love to hear it, but no one even has a lead on how it could possibly be untrue. People have tried very hard.

              • sysgen [none/use name,they/them] · edit-2 · 1 year ago

                I mean, sure, there are things that can't be computed even though they are possible. But computing them is an impossible task, is it not? So our computers can still realize any possible data-wrangling task; you just gave an example of an impossible data-wrangling task.

                > Calling the devices that run our universe that can bypass quantum randomness “computers” would be like comparing a grain of rice to a dildo.

                Our computers can also exhibit true randomness (using quantum processes), so there is no problem there. You can make a quantum-accurate simulation of any event, in the sense that if you re-did the event, on the computer or in real life, you'd arrive at the same averages; i.e., it's a perfect simulation of actually doing the event again and observing it.
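
                A toy example of what "the same average" means here (my sketch, with made-up numbers; the name p_one is mine, and an ordinary pseudorandom generator stands in for a hardware quantum RNG): simulate repeated measurements of a qubit prepared so that it reads 1 with probability 0.7, and the simulated statistics match what the physical experiment would give.

                ```python
                import random

                # Born rule: outcome probabilities are |amplitude|^2. Sampling from those
                # probabilities reproduces the statistics of repeating the real measurement.
                # random.random() is pseudorandom; a hardware (quantum) RNG could be swapped
                # in where true randomness matters.

                p_one = 0.7                      # probability of measuring |1>
                shots = 100_000
                ones = sum(1 for _ in range(shots) if random.random() < p_one)

                print(ones / shots)              # ~0.7, matching the physical expectation
                ```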