Just a regular Joe.

  • 0 Posts
  • 168 Comments
Joined 2 years ago
Cake day: July 7th, 2023


  • Another technique that helps is to limit the information shared with clients to need-to-know info. This can be computationally intensive server-side and hard to get right … but it helps in many cases, and there are evolving techniques to do it.

    In FPS games, there can also be streaming input validation. eg. Accurate fire requires the right sequence of input events, and deviations can be used for cheat detection. Once cheats have to emulate human behaviour, with human-like reaction times, the value of cheating drops.

    That’s the advanced stuff. Many games don’t even check whether people are running around out of bounds, flying through the air etc. Known bugs and map exploits don’t get fixed for years.


  • Not everything will be open source. For whatever reason, they decided to make this obfuscator open source. It might also just be an interesting side project that someone got permission to release.

    Obfuscation can make it harder to reverse engineer code, even if the method is known. It might also be designed to be pluggable, allowing custom obfuscation. I haven’t checked.

    We also know that obfuscation isn’t real security … but sometimes it’s good enough for a particular use case.
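    A toy shell analogy (base64 is an encoding, not an obfuscator, but it shows the principle: the data is transformed, not protected):

```shell
# "Obfuscate" a string: unreadable at a glance ...
echo "secret_api_key" | base64
# -> c2VjcmV0X2FwaV9rZXkK

# ... but trivially reversible once the method is known.
echo "c2VjcmV0X2FwaV9rZXkK" | base64 -d
# -> secret_api_key
```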


  • ALSA is the lowest level: it’s the kernel interface to the audio hardware. PipeWire provides a userspace service to share the limited hardware between applications.

    Try setting “export PIPEWIRE_LATENCY=2048/48000” before running an audio producing application (from the same shell).

    Distortion can sometimes be related to the audio buffers not getting filled in time, so increasing the buffering as above gives it more time to even out. You can try 1024 instead of 2048 too.
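    As a concrete sketch of the per-shell approach (the media player at the end is just a placeholder):

```shell
# Ask PipeWire for a larger quantum (buffer) for apps started from this shell.
# 2048 frames at 48000 Hz is roughly 43 ms of buffering; try 1024 for less.
export PIPEWIRE_LATENCY=2048/48000

# Confirm the value that child processes will inherit.
echo "$PIPEWIRE_LATENCY"

# Then launch the audio-producing app from this same shell, eg.:
#   mpv some-album.flac
```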

    There is no doubt a way to set it globally, if it helps.

    Good luck!


  • Except my crazy relative (just 1, thank dog) also has telegram and feels the urge to forward every damn whackjob conspiracy theory reinterpretation of truth that they find to me and my wife, despite us never replying except to ask them to stop. eg. Cloud seeding, windmills and electric cars are responsible for destroying the atmosphere (not co2 and other greenhouse gases); Bill Gates etc. are spreading microchips through vaccinations; judges ruling that measles doesn’t exist; Ukraine is full of nazis; and yes, even regurgitated feelgood fairy tales and random cat pictures from Facebook. So glad they are in a country far far away from me. They “do their own research”, of course.

    So bloody sad that so many people are in a similar situation of avoiding friends and family for their own sanity (and sometimes safety).




  • But not Fire tablets (kids profile) or Samsung TV or many others that Plex currently supports.

    The Jellyfin Android phone app’s UI is a little weird at times, but it works pretty well for me.

    What I would adore from any app would be an easy way to upload specific content and metadata via SFTP or to blob storage, accessible with auth (basic, token, or cloud), to more easily share it with friends/family/myself without having to host the whole damn library on the Internet or share my home Internet connection at inconvenient times.

    Client-side encryption would be a great addition to that (eg. a required password that adds a key to the key ring), along with native support for it in Jellyfin and other apps. It could even be made to work with a JS & WASM player.
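    As a rough sketch of how the password-based part could work today with plain OpenSSL (file names and the password are placeholders, and AES-256-CBC is just one reasonable choice):

```shell
# Stand-in for a real media file.
echo "demo payload" > episode01.mkv

# Encrypt client-side with a password-derived key before uploading.
openssl enc -aes-256-cbc -pbkdf2 -iter 200000 -salt \
  -in episode01.mkv -out episode01.mkv.enc -pass pass:changeme

# The recipient decrypts after download with the shared password.
openssl enc -d -aes-256-cbc -pbkdf2 -iter 200000 \
  -in episode01.mkv.enc -out decrypted.mkv -pass pass:changeme

cmp episode01.mkv decrypted.mkv && echo "round trip OK"
```

    A real integration would stream-decrypt in the player (the JS & WASM idea), but the key handling has the same shape.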


  • In many countries the onus is on the connection owner to point the finger at the next person, or to otherwise prove it wasn’t them / their responsibility. Even if they can, fighting off lawyers who are experts in this area is costly (time & typically money).

    Exactly the same problem exists with a VPN. What makes it personal? It’s just a service you bought, which can be used by multiple people and devices. The source IP typically links it to the user. So … back to the point of picking a trusted VPN provider in a trusted region.

    For civil matters (like copyright infringement in most jurisdictions), a standard VPN (with egress in another jurisdiction) and client-side precautions will be fine, crypto or no crypto. Frankly, it’s quite normal for people to use VPNs these days. My employer even recommends employees use personal VPNs for their personal devices.

    For the despicable shit, espionage etc., onion routing and crypto might be better. The police and agencies have many more tools at their disposal, and any mistake could be one’s undoing.

    A firm dropping crypto is hardly a reason to declare a holy war against a VPN provider. For those who care, they already do.


  • There is near-zero privacy when a VPN has your real IP address and could log connections (source/dest/proto/port). Don’t fool yourself into thinking that using crypto payments adds much, unless you are constructing your own VPN onion OR you are concerned about what shows up on your CC bill directly (eg. to hide it from your family).

    When you use a VPN provider, you are trusting them with your anonymity - that they don’t keep logs of connections that could lead back to your identity, and/or they won’t hand that info over to others (law enforcement, courts, some spook slipping an employee $500).

    It’s also quite likely that the average VPN user uses the same web browser (full of fingerprints and tasty cookies) to identify themselves in 10 different ways as they visit their shady torrent/porn/zuckerberg/fetish/gambling/bezos sites, removing another layer of the obscurity onion.





  • And contributions to codebases that evolved to meet the owning team’s own needs, by teams that similarly don’t have the time or space to refactor or review as needed to enable effective contributions.

    Enabling Innersource will be a priority for management for only two weeks, anyway, before they focus on something else. And if it even makes it into measurable goals, it will probably be gamed so it doesn’t ruin bonuses.

    Do you also work for $GenericMultinationalCompany, perchance? Do you also know $BillFromFinance?



  • Encryption will typically be CPU bound, while many servers will be I/O bound (eg. file hosting, rather than computing stuff). So it will probably be fine.

    Encryption can help when someone gets physical access to the machine or hard disk. If they can log in to the running system (or dump RAM, which is possible with VMs & containers), it won’t bring much value.

    You will of course need to log in and mount the encrypted volume after a restart.

    At my work, we want to make sure that secrets are adequately protected at rest, and we follow good hygiene practices like regularly rotating credentials, time limited certificates, etc. We tend to trust AWS KMS to encrypt our data, except for a few special use cases.
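    For illustration, the envelope-encryption pattern that KMS automates can be sketched with plain openssl (the local passphrase stands in for the KMS master key; all names are made up):

```shell
# A random data key encrypts the payload ...
openssl rand -hex 32 > data.key
echo "db_password=hunter2" > secrets.env
openssl enc -aes-256-cbc -pbkdf2 -in secrets.env -out secrets.env.enc \
  -pass file:data.key

# ... and a master key (held by KMS in real life) encrypts the data key.
openssl enc -aes-256-cbc -pbkdf2 -in data.key -out data.key.enc \
  -pass pass:master-key-stand-in

# Only the ciphertexts need to live at rest.
rm data.key secrets.env
```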

    Do you have a particular risk that you are worried about?


  • Joe@discuss.tchncs.de to Selfhosted@lemmy.world · Secrets Management

    Normally you wouldn’t need a secrets store on the same server that needs the secrets, since the service/app that uses them often stores them unencrypted anyway. An encrypted disk might be better in that case.

    That said, Vault has some useful features like issuing temporary credentials (eg. for access to AWS, DBs, servers) or certificate management. If you have these use-cases, it could be useful, even on the same server.

    At my work, we tend to store deployment-time secrets either in protected GitLab variables or in Vault. Sometimes we use AWS KMS to encrypt values in config files, which we check in to git repositories.



  • It typically takes a small core team to build the framework/architecture that enables many others to contribute meaningfully.

    Most OSS projects get bugger all contributions from outside the initial core team, having limited ability to onboard people. The biggest and most active (out of necessity or by design) have a contribution friendly software architecture and process, and often deliberately organized communities (eg. K8S & CNCF) or major corporate sponsors filling the role.

    Free Software and the resulting ecosystems seem to have a better chance of contributing to the common good over the long term. This is simply because most companies are beholden to their shareholders, and at some point the urge to squeeze every last cent out of an opportunity comes to the forefront, poisoning many initially well-intentioned efforts.

    Free Software licenses like the GPL help to protect our freedom and to set open standards, and are essential for the core technology stack.

    When someone gets annoyed enough with some shitty software or its license terms, they can reimplement the core functionality in a few days/weeks/months and release decent free software that kills off the shitty alternatives … or even just a better commercial alternative. This only works because of the open platforms & protocols.

    One of the major challenges for consumers is finding good software today in the grey goo of projects and appstores. This harks back to OP’s point about curated collections of software. It’s also where the various foundations add value (CNCF, Linux Foundation, Apache) … along with “awesome X” gitlab repos, which are far better than random youtube videos or ad-riddled blogs or magazine articles.


  • The true strength is in the open interfaces and common protocols that enable competition and choice, followed by the free-to-use libraries that establish a foundation upon which we can build and iterate. This helps us to stay in control of our hardware, our data, and our destiny.

    Practically speaking, there is often more value in releasing something as free software than in commercialising it or otherwise tightly controlling the source code, especially for smaller tools and libraries.

    Many bigger projects (eg. linux kernel, firefox, kubernetes, apache*) help set the direction of entire industries, building new opportunities as they go, thanks to the standardization that comes from their popularity.

    It’s also a reason why many companies release software as open source too, especially in the early days, establishing themselves as THE leader…for a while at least (eg. Docker Inc, Hashicorp).

