
Error in generate_report: 'content' #1052

Open
hherb opened this issue Dec 29, 2024 · 5 comments

hherb commented Dec 29, 2024

Describe the bug
Since updating to version 0.10.9 (the same with 0.10.10), I get the error "Error in generate_report: 'content'" after successful web scraping. This only happens when I use a custom log handler; the moment I comment out the custom log handler, the report is generated as usual.

This happens irrespective of the LLM used (OpenAI default, various Ollama models, ...) and regardless of the type of research.

How can the log handler affect report generation that way?

To Reproduce
This is the relevant part of my Python app:

from typing import Any, Dict

from gpt_researcher import GPTResearcher


class CustomLogsHandler:
    """A custom logs handler class to handle JSON data."""

    def __init__(self):
        self.logs = []  # Initialize logs to store data

    def set_widget(self, widget):
        self.widget = widget

    async def send_json(self, data: Dict[str, Any]) -> None:
        """Send JSON data and log it."""
        self.logs.append(data)  # Append data to logs
        output = data['output']
        if starts_with_emoji(output):  # helper defined elsewhere in my app
            output = f"\n{output}"
        self.widget.stream(output)
        if data['metadata']:
            for item in data['metadata']:
                self.widget.stream(item)


async def do_research(query: str, report_type: str = "research_report", widget=None) -> str:
    research_settings()  # app-specific configuration, defined elsewhere
    custom_logs_handler = CustomLogsHandler()
    researcher = GPTResearcher(query=query,
                               report_type=report_type,
                               config_path=None,
                               websocket=custom_logs_handler,
                               )
    researcher.set_verbose(True)

    # Initialize the researcher
    with widget.add_step(status="running", running_title="Researching ...", success_title="job done!") as chatstep:
        custom_logs_handler.set_widget(chatstep)
        await researcher.conduct_research()
    report = await researcher.write_report()
    return report

Log excerpt:
INFO: [07:42:51] ✍️ Writing report for 'aspartame consumption and diabetes risk'...
{'content': 'writing_report',
'metadata': None,
'output': "✍️ Writing report for 'aspartame consumption and diabetes risk'...",
'type': 'logs'}
{'output': '# Aspartame Consumption and Diabetes Risk: A Comprehensive '
'Analysis\n'
'\n',
'type': 'report'}
Error in generate_report: 'content'
INFO: [07:42:53] 📝 Report written for 'aspartame consumption and diabetes risk'
{'content': 'report_written',
'metadata': None,
'output': "📝 Report written for 'aspartame consumption and diabetes risk'",
'type': 'logs'}
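Note that the 'report'-type message in the excerpt above carries only 'output' and 'type' keys. A minimal, hypothetical reproduction (editorial note, not from the gpt-researcher codebase) of how indexing 'content' on such a message would produce exactly the logged error:

```python
# Hypothetical minimal reproduction: a 'report'-type message from the log
# excerpt has no 'content' (or 'metadata') key, so direct indexing fails.
msg = {"output": "# Aspartame Consumption and Diabetes Risk ...", "type": "report"}

try:
    msg["content"]  # raises KeyError('content')
except KeyError as e:
    # str(e) keeps the quotes, matching the logged message exactly
    print(f"Error in generate_report: {e}")  # prints: Error in generate_report: 'content'
```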

Expected behavior
Previously, the report was generated without problems.

assafelovic (Owner) commented:
@kga245 any chance you can help with this?

hherb commented Dec 30, 2024 via email


kga245 commented Jan 8, 2025

Hi @hherb. @assafelovic brought me into the loop because logging is an area I have recently worked on improving.

I have stabilized logging recently, but I haven't experimented with custom logging. I'd love to know what you're trying to achieve, so let's start there if you don't mind.

Here's what I can tell from the code you supplied:

The "Something went wrong!" error you are seeing is likely because:

  • The custom handler isn't properly integrated with the frontend websocket system
  • The backend isn't receiving or processing the websocket messages correctly
  • There's a version mismatch between your frontend dependencies and what's expected

Alternatively, if you package up a branch, I'm happy to see if I can diagnose it locally.


hherb commented Jan 8, 2025 via email


kga245 commented Jan 8, 2025

I'll have a go at this. You save lives.
