Posts

  • wasmer-php

    🐘🕸️ WebAssembly runtime for PHP

    • Easy to use: The wasmer API mimics the standard WebAssembly C API,
    • Fast: wasmer executes the WebAssembly modules as fast as possible, close to native speed,
    • Safe: All calls to WebAssembly will be fast, but more importantly, completely safe and sandboxed.

    Tags: #php • php-extension • rust

  • FakeRest

    Patch fetch/XMLHttpRequest to fake a REST API server in the browser, based on JSON data.

    A browser library that intercepts AJAX calls to mock a REST server based on JSON data.

    Use it in conjunction with MSW, fetch-mock, or Sinon.js to test JavaScript REST clients on the client side (e.g. single-page apps) without a server.

    See it in action in the react-admin demo (source code).

    Tags: #typescript • rest • client

  • rspec-snapshot

    RSpec snapshot testing.

    Adds snapshot testing to RSpec, inspired by Jest.

    Tags: #ruby

  • preconstruct

    🎁 Dev and build your code painlessly in monorepos

    Tags: #typescript

  • folio

    Page-based routing for Laravel.

    Laravel Folio is a powerful page-based router designed to simplify routing in Laravel applications.

    Tags: #php

  • react-native-fcm

    React Native module for Firebase Cloud Messaging and local notifications

    I’m no longer able to maintain this repo. Check react-native-firebase instead.

    Tags: #java • notifications • fcm

  • partyline

    Output to Laravel’s console from outside of your Command classes.

    This package allows you to output to the console from outside of your command classes.

    For example, you might have a feature that does the same thing from a command and through the web. Until now, you may have found yourself duplicating code just to be able to output to the console in various places.

    With Partyline, you can use output commands within your logic. If it’s being run inside the console, you’ll see it. Otherwise, nothing will happen.

    Tags: #php • laravel • laravel-5-package

  • pokemon-essentials

    A heavily modified RPG Maker XP game project that makes the game play like a Pokémon game. Not a full project in itself; this repo is to be added into an existing RMXP game project.

    1. Fork this repo.
    2. Get a copy of Essentials v21.1 (a download link cannot be provided here).
    3. Clone your forked repo into the Essentials v21.1 folder, replacing the existing files with the ones from the repo.

    From here, you can edit this project to turn it into your fangame/develop mods. When this repo is updated, you can pull the changes to update your fork and get the updates into your fangame/modding environment.

    Tags: #ruby • pokemon • rpgmakerxp

  • capistrano-db-tasks

    Add Capistrano tasks for syncing remote and local databases

    Adds database AND assets tasks to Capistrano for a Rails project. It only works with Capistrano 3; versions up to 0.3 work with Capistrano 2.

    Currently:

    • It only supports MySQL and PostgreSQL (on both the remote and local sides)
    • It synchronizes assets from remote to local and from local to remote

    The mysql, mysqldump (or pg_dump, psql), bzip2 and bunzip2 (or gzip) commands must be in your PATH.

    Feel free to fork it and add support for more databases or new tasks.

    Tags: #ruby

  • aitextgen

    A robust Python tool for text-based AI training and generation using GPT-2.

    A robust Python tool for text-based AI training and generation using OpenAI’s GPT-2 and EleutherAI’s GPT Neo/GPT-3 architecture.

    aitextgen is a Python package that leverages PyTorch, Hugging Face Transformers and pytorch-lightning with specific optimizations for text generation using GPT-2, plus many added features. It is the successor to textgenrnn and gpt-2-simple, taking the best of both packages:

    • Finetunes on a pretrained 124M/355M/774M GPT-2 model from OpenAI or a 125M/350M GPT Neo model from EleutherAI…or create your own GPT-2/GPT Neo model + tokenizer and train from scratch!
    • Generates text faster than gpt-2-simple and with better memory efficiency!
    • With Transformers, aitextgen preserves compatibility with the base package, allowing you to use the model for other NLP tasks, download custom GPT-2 models from the HuggingFace model repository, and upload your own models! Also, it uses the included generate() function to allow a massive amount of control over the generated text.
    • With pytorch-lightning, aitextgen trains models not just on CPUs and GPUs, but also multiple GPUs and (eventually) TPUs! It also includes a pretty training progress bar, with the ability to add optional loggers.
    • The input dataset is its own object, allowing you not only to easily encode, cache, and compress megabytes of data in seconds on a local computer before transporting it to a remote server, but also to merge datasets without biasing the resulting dataset, or to cross-train on multiple datasets to create blended output.

    You can read more about aitextgen in the documentation!
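
    As a rough illustration of the workflow the bullets above describe, here is a minimal sketch following aitextgen’s documented quickstart; the corpus path, prompt, and step/length values are placeholder choices rather than anything taken from this post.

      from aitextgen import aitextgen
      from aitextgen.TokenDataset import TokenDataset

      # Load the default pretrained 124M GPT-2 model.
      ai = aitextgen()

      # Generate a few samples from a prompt (all parameters here are placeholder choices).
      ai.generate(n=3, prompt="The meaning of life is", max_length=100)

      # The input corpus can be wrapped in its own dataset object before training;
      # "corpus.txt" is a hypothetical path.
      data = TokenDataset("corpus.txt")

      # Finetune on that dataset, then sample again from the finetuned weights.
      ai.train(data, num_steps=2000)
      ai.generate(prompt="The meaning of life is")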

    Tags: #python

subscribe via RSS