Save User Activity in a JSON File

I am trying to save user activities in a JSON file, but whenever the file gets bigger and multiple users are working at the same time, the JSON file loses the old records.

This is my trait:

use Illuminate\Support\Facades\Auth;
use Illuminate\Support\Facades\Storage;

trait CustomLogActivity
{
    // Register a listener for each model event when the trait is booted.
    protected static function bootCustomLogActivity()
    {
        foreach (static::getModelEvents() as $event) {
            static::$event(function ($model) use ($event) {
                $model->recordActivity($event);
            });
        }
    }

    // Model events that should be logged.
    protected static function getModelEvents()
    {
        return ['created', 'updated', 'deleted'];
    }

    // Build the activity payload for the given event and append it to the log.
    protected function recordActivity($event)
    {
        $activity = [
            'user_id'   => Auth::id(),
            'type'      => $event,
            'subject'   => (new \ReflectionClass($this))->getShortName(),
            'timestamp' => now(),
        ];

        if ($event === 'updated') {
            // For updates, keep both the previous and the new attribute values.
            $activity['old_properties'] = $this->getOriginal();
            $activity['new_properties'] = $this->getAttributes();
        } else {
            $activity['properties'] = $this->getAttributes();
        }

        $this->appendToLog($activity);
    }

    // Append one JSON-encoded record to the log file on the default disk.
    protected function appendToLog($activity)
    {
        $logFile = 'activity.json';
        $log = json_encode($activity);

        Storage::append($logFile, $log);
    }

    protected function getActivityType($event)
    {
        $type = strtolower((new \ReflectionClass($this))->getShortName());

        return "{$event}_{$type}";
    }
}


Solution:

As I mentioned in the comments, I will post this as an answer so it is clear for anyone running into this type of issue:

The problem you are having is called concurrency.

I am assuming two processes use the file at the same time: both read the current content, then one of them writes. The other process still only has the old data in memory (it never sees the newly written content), so when it writes it overwrites the file and the first process's records are lost…
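
To make the race concrete, here is a hypothetical timeline of two requests appending at the same time, written with plain PHP file functions rather than the exact Laravel internals:

    // Request A reads the current log.
    $a = file_get_contents('activity.json');          // contains records 1..100

    // Request B reads the current log before A has written anything.
    $b = file_get_contents('activity.json');          // also contains records 1..100

    // Request A writes back its copy plus its new record.
    file_put_contents('activity.json', $a . $lineA);  // file now holds 1..100 + A

    // Request B writes back its stale copy plus its record, overwriting A's write.
    file_put_contents('activity.json', $b . $lineB);  // file now holds 1..100 + B (record A is gone)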

First of all, use a queue (triggered by events) to send the data, and then store it in Redis, a database, or something else that is fast for this purpose, but not literally a file: a file can lose data instantly, whereas a database will not…
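
As a rough sketch of that approach (the job name RecordUserActivity, the activity-log queue, and the user_activities table are my own assumptions, not from the question), the trait could dispatch a queued job that inserts into a database instead of touching the file:

    use Illuminate\Bus\Queueable;
    use Illuminate\Contracts\Queue\ShouldQueue;
    use Illuminate\Foundation\Bus\Dispatchable;
    use Illuminate\Support\Facades\DB;

    class RecordUserActivity implements ShouldQueue
    {
        use Dispatchable, Queueable;

        public function __construct(public array $activity)
        {
        }

        public function handle(): void
        {
            // A single INSERT is atomic, so concurrent jobs cannot overwrite
            // each other the way concurrent read-then-write file updates do.
            DB::table('user_activities')->insert([
                'user_id'    => $this->activity['user_id'],
                'type'       => $this->activity['type'],
                'subject'    => $this->activity['subject'],
                'properties' => json_encode($this->activity),
                'created_at' => $this->activity['timestamp'],
            ]);
        }
    }

In the trait, appendToLog() would then shrink to a one-line dispatch such as RecordUserActivity::dispatch($activity)->onQueue('activity-log');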

You can still use a file, but I would not recommend doing so, because it depends a lot on your infrastructure:

  • If you have a load balancer with 10 machines, are you going to have 10 different files (one per machine)?
    • How do you combine them?

So what I would do is have a queue (triggered by an event) and let that queue, with a single worker, handle this very specific task. Keep the throughput in mind, though: if events arrive in the queue faster than the single worker can process them, you will have to find a solution for that.
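
For the single-worker part, running one dedicated worker for that queue keeps the writes strictly sequential, for example (the activity-log queue name is again just an example):

    php artisan queue:work --queue=activity-log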
