diff --git a/docs/content/docs/cli/guides/cli-commands.mdx b/docs/content/docs/cli/guides/cli-commands.mdx index b2d1bcb..23a4a49 100644 --- a/docs/content/docs/cli/guides/cli-commands.mdx +++ b/docs/content/docs/cli/guides/cli-commands.mdx @@ -62,6 +62,18 @@ mk-notes sync -i -d -k +- `--flat`: Flatten the result page tree, making all pages direct children of the destination instead of maintaining nested folder structures. When enabled, the root node is skipped and all files are created as direct children of the parent page. + + + + Use `--flat` when you want a flat list of pages in Notion rather than a hierarchical structure. This is useful for: + + - Creating a simple list of documents without nested folders + - Syncing to databases where you want all items at the same level + - Simplifying navigation when folder structure isn't important + + + ### Destination Types Mk Notes supports two types of destinations: @@ -274,6 +286,46 @@ This command will: This is useful when you want to completely reset your synchronization and start fresh with new page IDs. +#### Syncing with Flat Structure + +```bash +mk-notes sync \ + --input ./my-docs \ + --destination https://notion.so/myworkspace/doc-123456 \ + --notion-api-key secret_abc123... \ + --flat +``` + +This command will: + +1. Read all markdown files in the `./my-docs` directory +2. **Flatten the structure** - all pages will be created as direct children of the destination page +3. Skip synchronizing the root node (if your directory has an `index.md`, it will be created as a child page instead of updating the parent) +4. Display a success message with the Notion page URL when complete + +**Example structure:** + +Without `--flat`: + +``` +Parent Page + └── docs/ + ├── getting-started.md + └── guides/ + └── installation.md +``` + +With `--flat`: + +``` +Parent Page + ├── docs/ (root copy) + ├── getting-started.md + └── installation.md +``` + +All files are now direct children, making navigation simpler. 
+ ## `preview-sync` The `preview-sync` command lets you preview how your markdown files will be organized in Notion before actually performing the synchronization. This is useful for verifying the structure before making any changes. @@ -294,6 +346,7 @@ mk-notes preview-sync -i [options] - `plainText` (default): Shows a tree-like structure - `json`: Outputs the structure in JSON format - `-o, --output `: Save the preview to a file instead of displaying it in the terminal +- `--flat`: Flatten the preview structure, showing how pages will appear when using the `--flat` option in sync. All pages will be displayed as direct children of the root. ### Examples @@ -341,6 +394,14 @@ mk-notes preview-sync --input ./my-docs --output preview.txt mk-notes preview-sync --input ./my-docs/README.md ``` +5. Preview with flat structure: + +```bash +mk-notes preview-sync --input ./my-docs --flat +``` + +This shows how the structure will look when using `--flat` during sync, with all pages as direct children of the root. + --- ## Single File vs Directory Synchronization diff --git a/docs/content/docs/cli/guides/database-sync.mdx b/docs/content/docs/cli/guides/database-sync.mdx index c2369ab..ad2803d 100644 --- a/docs/content/docs/cli/guides/database-sync.mdx +++ b/docs/content/docs/cli/guides/database-sync.mdx @@ -148,6 +148,31 @@ mk-notes sync -i ./docs -d -k --clean --force-new +### Flat Structure + +The `--flat` option flattens the page tree, making all pages direct children of the database instead of maintaining nested folder structures: + +```bash +mk-notes sync \ + --input ./docs \ + --destination https://notion.so/myworkspace/database-123456 \ + --notion-api-key secret_abc123... 
\ + --flat +``` + +When syncing to databases, `--flat` is particularly useful because: + +- All pages appear at the same level in the database +- Easier to filter and sort all documents uniformly +- No nested hierarchy to navigate +- Better for database views where you want all items visible at once + + + +When using `--flat` with database sync, the root node is skipped and all files become direct database items. This creates a clean, flat list of all your documents in the database, making them easier to manage with database views and filters. + + + ### Combining Options You can combine `--save-id` and `--force-new` with `--clean` for different workflows: @@ -156,8 +181,10 @@ You can combine `--save-id` and `--force-new` with `--clean` for different workf | ------------------------------- | ---------------------------------------------------------- | | `--save-id` | Updates existing pages by ID, saves new IDs to frontmatter | | `--force-new` | Creates new pages, ignores existing IDs | +| `--flat` | Flattens structure, all pages become direct children | | `--clean --save-id` | Deletes old pages, creates new ones, saves new IDs | | `--clean --force-new --save-id` | Full reset: deletes all, creates new pages, saves new IDs | +| `--flat --save-id` | Flat structure with incremental updates | --- diff --git a/docs/content/docs/github-actions/available-actions/2-sync-action.mdx b/docs/content/docs/github-actions/available-actions/2-sync-action.mdx index 1c1c6b7..291374a 100644 --- a/docs/content/docs/github-actions/available-actions/2-sync-action.mdx +++ b/docs/content/docs/github-actions/available-actions/2-sync-action.mdx @@ -31,15 +31,16 @@ steps: ## Inputs -| Input | Description | Required | Default | -| ---------------- | --------------------------------------------------------------------------------------------------------------------- | -------- | ------- | -| `input` | The path to the markdown file or directory to synchronize | `true` | - | -| `destination` | The 
Notion page or database URL where you want to synchronize your markdown files | `true` | - | -| `notion-api-key` | Your Notion secret token | `true` | - | -| `clean` | Clean sync mode - removes ALL existing content from the destination before syncing | `false` | `false` | -| `lock` | Lock the Notion page after syncing | `false` | `false` | -| `save-id` | Save Notion page IDs back to the source markdown files' frontmatter, enabling incremental updates on subsequent syncs | `false` | `false` | -| `force-new` | Force creation of new pages, ignoring any existing page IDs in the markdown frontmatter | `false` | `false` | +| Input | Description | Required | Default | +| ---------------- | -------------------------------------------------------------------------------------------------------------------------- | -------- | ------- | +| `input` | The path to the markdown file or directory to synchronize | `true` | - | +| `destination` | The Notion page or database URL where you want to synchronize your markdown files | `true` | - | +| `notion-api-key` | Your Notion secret token | `true` | - | +| `clean` | Clean sync mode - removes ALL existing content from the destination before syncing | `false` | `false` | +| `lock` | Lock the Notion page after syncing | `false` | `false` | +| `save-id` | Save Notion page IDs back to the source markdown files' frontmatter, enabling incremental updates on subsequent syncs | `false` | `false` | +| `force-new` | Force creation of new pages, ignoring any existing page IDs in the markdown frontmatter | `false` | `false` | +| `flat` | Flatten the result page tree, making all pages direct children of the destination instead of maintaining nested structures | `false` | `false` | ## Destination Types @@ -294,11 +295,47 @@ jobs: git push ``` +### Sync with Flat Structure + +Use `flat` to create a flat list of pages instead of maintaining nested folder structures: + +```yaml +name: Flat Sync + +on: + push: + branches: [main] + paths: ['docs/**'] + 
+jobs: + flat-sync: + runs-on: ubuntu-latest + steps: + - name: Checkout repository + uses: actions/checkout@v4 + + - name: Sync with flat structure + uses: Myastr0/mk-notes/sync + with: + input: './docs' + destination: ${{ secrets.NOTION_DOCS_PAGE_URL }} + notion-api-key: ${{ secrets.NOTION_API_KEY }} + flat: 'true' +``` + +When `flat: 'true'` is used: + +- All pages are created as direct children of the destination page +- The root node is skipped (if your directory has an `index.md`, it becomes a child page) +- Folder hierarchy is flattened into a simple list +- Useful for creating simple document lists or syncing to databases where you want all items at the same level + ## Important Notes - **For clean sync**: The `clean` option removes ALL existing content from the destination before syncing - **For save-id**: Remember to commit the updated markdown files back to your repository - **For force-new**: Use with caution as it may create duplicate pages if used without `clean` +- **For flat**: When using `flat`, the root node is skipped and all files become direct children. 
This is ideal for simple document lists or database syncs where hierarchy isn't needed - Use clean sync only when you're sure you want to replace content - Always test with the [preview action](./1-preview-action.mdx) first - Make sure your Notion integration has access to the target page or database diff --git a/preview/index.js b/preview/index.js index e18504e..cc986f5 100644 --- a/preview/index.js +++ b/preview/index.js @@ -75020,7 +75020,7 @@ class MkNotes { /** * Synchronize a markdown file to Notion */ - async synchronizeMarkdownToNotionFromFileSystem({ inputPath, parentNotionPageId, cleanSync = false, lockPage = false, saveId = false, forceNew = false, }) { + async synchronizeMarkdownToNotionFromFileSystem({ inputPath, parentNotionPageId, cleanSync = false, lockPage = false, saveId = false, forceNew = false, flatten = false, }) { const synchronizeMarkdownToNotion = new domains_1.SynchronizeMarkdownToNotion({ logger: this.logger, destinationRepository: this.infrastructureInstances.notionDestination, @@ -75034,6 +75034,7 @@ class MkNotes { lockPage, saveId, forceNew, + flatten, }); } } @@ -76137,6 +76138,27 @@ class SiteMap { this.removeUselessNodesTree(this._root); this.traverseAndUpdate(this._root); } + /** + * Returns a new SiteMap instance with the tree flattened under the root node + */ + flatten() { + const flattenedSiteMap = new SiteMap(); + const flattenedChildren = this._root.flatten(); + // Create a copy of the original root node WITHOUT its children + // (just the root node itself, not its nested structure) + const rootCopy = new TreeNode_1.TreeNode({ + id: this._root.id, + name: this._root.name, + filepath: this._root.filepath, + children: [], + parent: flattenedSiteMap._root, + }); + // Create copies of all flattened children with their children cleared, so every page becomes a direct child of the new root + const flattenedChildrenCopies = flattenedChildren.map((child) => new TreeNode_1.TreeNode({ id: child.id, name: child.name, filepath: child.filepath, children: [], parent: flattenedSiteMap._root })); + // Set children: original root copy 
(without nested children) first, then all flattened descendants copies + flattenedSiteMap._root.children = [rootCopy, ...flattenedChildrenCopies]; + return flattenedSiteMap; + } /** * TODO: Implement mkdocs.yaml sitemap parsing * @@ -76216,6 +76238,17 @@ class TreeNode { node.children = json.children.map((child) => this.fromJSON(child, node)); return node; } + flatten() { + // Recursively flatten all children + const flattenedChildren = this.children.reduce((acc, child) => acc.concat(child.flatten()), []); + // If the node is the root node, return only the flattened children + // (the root itself is not included as it's already in the SiteMap) + if (this.parent === null) { + return flattenedChildren; + } + // For non-root nodes, include this node followed by its flattened children + return [this, ...flattenedChildren]; + } } exports.TreeNode = TreeNode; @@ -76379,7 +76412,7 @@ class PreviewSynchronization { constructor(params) { this.sourceRepository = params.sourceRepository; } - async execute(args, { format } = {}) { + async execute(args, { format, flatten } = {}) { // Check if the source repository is accessible try { await this.sourceRepository.sourceIsAccessible(args); @@ -76406,7 +76439,10 @@ class PreviewSynchronization { } } const filePaths = await this.sourceRepository.getFilePathList(args); - const siteMap = sitemap_1.SiteMap.buildFromFilePaths(filePaths); + let siteMap = sitemap_1.SiteMap.buildFromFilePaths(filePaths); + if (flatten) { + siteMap = siteMap.flatten(); + } return sitemapSerializer(siteMap); } } @@ -76436,7 +76472,7 @@ class SynchronizeMarkdownToNotion { this.logger = params.logger; } async execute(args) { - const { notionParentPageUrl, cleanSync, lockPage, saveId, forceNew, ...others } = args; + const { notionParentPageUrl, cleanSync, lockPage, saveId, forceNew, flatten, ...others } = args; const notionObjectId = this.destinationRepository.getObjectIdFromObjectUrl({ objectUrl: notionParentPageUrl, }); @@ -76465,7 +76501,10 @@ class 
SynchronizeMarkdownToNotion { try { this.logger.info('Starting synchronization process'); const filePaths = await this.sourceRepository.getFilePathList(others); - const siteMap = sitemap_1.SiteMap.buildFromFilePaths(filePaths); + let siteMap = sitemap_1.SiteMap.buildFromFilePaths(filePaths); + if (flatten) { + siteMap = siteMap.flatten(); + } // Traverse the SiteMap and synchronize files const pages = await this.synchronizeTreeNode({ node: siteMap.root, @@ -76474,6 +76513,7 @@ class SynchronizeMarkdownToNotion { lockPage, cleanSync, forceNew, + flatten, }); this.logger.info('Synchronization process completed successfully'); if (saveId) { @@ -76534,24 +76574,30 @@ class SynchronizeMarkdownToNotion { /** * Main orchestrator for synchronizing a tree node and its children */ - async synchronizeTreeNode({ node, parentObjectId, parentObjectType, lockPage, cleanSync, forceNew, }) { + async synchronizeTreeNode({ node, parentObjectId, parentObjectType, lockPage, cleanSync, forceNew, flatten, }) { this.validateParentObjectType(parentObjectType); const nodeToSync = this.getNodeToSynchronize(node, parentObjectType); const results = []; - const { page: rootPageElement, treeNodeId: rootTreeNodeId } = await this.synchronizeRootNode({ - node: nodeToSync, - parentObjectId, - parentObjectType, - lockPage, - cleanSync, - forceNew, - }); - results.push({ page: rootPageElement, treeNodeId: rootTreeNodeId }); + let rootPageElement; + // If not flattening, synchronize the root node + if (!flatten) { + const { page, treeNodeId: rootTreeNodeId } = await this.synchronizeRootNode({ + node: nodeToSync, + parentObjectId, + parentObjectType, + lockPage, + cleanSync, + forceNew, + flatten, + }); + rootPageElement = page; + results.push({ page, treeNodeId: rootTreeNodeId }); + } for (const childNode of node.children) { try { const childResults = await this.synchronizeChildNode({ childNode, - parentPageId: rootPageElement.id, + parentPageId: rootPageElement?.id ?? 
parentObjectId, lockPage, forceNew, }); @@ -76570,7 +76615,7 @@ class SynchronizeMarkdownToNotion { * Synchronizes the root node to the parent object (page or database) * Returns the page ID to use as parent for child nodes */ - async synchronizeRootNode({ node, parentObjectId, parentObjectType, lockPage, cleanSync, forceNew, }) { + async synchronizeRootNode({ node, parentObjectId, parentObjectType, lockPage, cleanSync, forceNew, flatten, }) { this.logger.info(`Adding content from ${node.filepath} to parent ${parentObjectType}`); const pageElement = await this.fetchAndConvertToPageElement(node.filepath, { forceNew, @@ -76590,39 +76635,63 @@ class SynchronizeMarkdownToNotion { }; } } - if (parentObjectType === 'unknown') { - throw new Error('Parent object type is unknown'); - } - if (parentObjectType === 'page') { - // If clean sync is enabled, delete all existing content first - if (cleanSync) { - this.logger.info('Clean sync enabled - removing existing content'); - try { - await this.destinationRepository.deleteChildBlocks({ - parentPageId: parentObjectId, - }); - this.logger.info('Successfully removed existing content'); - } - catch (error) { - this.logger.warn('Failed to remove existing content, continuing with sync', { error }); - } - const newPage = await this.destinationRepository.createPage({ + switch (parentObjectType) { + case 'page': + return await this.synchronizeRootNodeWithParentPage({ + node, + parentObjectId, + parentObjectType, pageElement, + lockPage, + cleanSync, + flatten, + }); + case 'database': + return await this.synchronizeRootNodeWithParentDatabase({ + node, parentObjectId, parentObjectType, + pageElement, + cleanSync, }); - pageElement.id = newPage.pageId; - return { page: pageElement, treeNodeId: node.id }; + case 'unknown': + default: + throw new Error(`Invalid parent object type: ${parentObjectType}`); + } + } + async synchronizeRootNodeWithParentPage({ node, parentObjectId, parentObjectType, pageElement, lockPage, cleanSync, flatten, 
}) { + if (cleanSync) { + this.logger.info('Clean sync enabled - removing existing content'); + try { + await this.destinationRepository.deleteChildBlocks({ + parentPageId: parentObjectId, + }); + this.logger.info('Successfully removed existing content'); + } + catch (error) { + this.logger.warn('Failed to remove existing content, continuing with sync', { error }); } - const updatedPage = await this.destinationRepository.updatePage({ - pageId: parentObjectId, + } + if (flatten) { + const newPage = await this.destinationRepository.createPage({ pageElement, + parentObjectId, + parentObjectType, }); - pageElement.id = updatedPage.pageId; - this.logger.info(`Updated parent page ${parentObjectId}`); - await this.lockPageIfNeeded(parentObjectId, lockPage); + pageElement.id = newPage.pageId; return { page: pageElement, treeNodeId: node.id }; } + const updatedPage = await this.destinationRepository.updatePage({ + pageId: parentObjectId, + pageElement, + }); + pageElement.id = updatedPage.pageId; + this.logger.info(`Updated parent page ${parentObjectId}`); + await this.lockPageIfNeeded(parentObjectId, lockPage); + return { page: pageElement, treeNodeId: node.id }; + } + async synchronizeRootNodeWithParentDatabase({ node, parentObjectId, parentObjectType, pageElement, cleanSync, }) { + // If clean sync is enabled, delete all existing content first if (cleanSync) { await this.destinationRepository.deleteChildBlocks({ parentPageId: parentObjectId, @@ -76679,7 +76748,7 @@ class SynchronizeMarkdownToNotion { treeNodeId: childNode.id, }); // Recursively process children - for (const grandChild of childNode.children) { + await Promise.all(childNode.children.map(async (grandChild) => { const grandChildSyncResult = await this.synchronizeChildNode({ childNode: grandChild, parentPageId: pageElement.id, @@ -76687,7 +76756,7 @@ class SynchronizeMarkdownToNotion { forceNew, }); syncResult.push(...grandChildSyncResult); - } + })); await this.lockPageIfNeeded(pageElement.id, lockPage); 
return syncResult; } diff --git a/src/MkNotes.test.ts b/src/MkNotes.test.ts index 212f3fd..d92d61d 100644 --- a/src/MkNotes.test.ts +++ b/src/MkNotes.test.ts @@ -58,6 +58,18 @@ describe('MkNotes', () => { expect(writeFileSpy).toHaveBeenCalledWith(outputPath, expect.any(String)); }); + + it('should support flatten option', async () => { + const result = await mkNotes.previewSynchronization({ + inputPath: 'fake/input/path.md', + format: 'json', + flatten: true, + }); + + expect(result).toBeTruthy(); + expect(typeof result).toBe('string'); + expect(JSON.parse(result)).toEqual(expect.any(Object)); + }); }); describe('synchronizeMarkdownToNotionFromFileSystem', () => { @@ -77,11 +89,35 @@ describe('MkNotes', () => { lockPage: false, saveId: false, forceNew: false, + flatten: false, }); // Without cleanSync, the root file updates the parent page expect(updatePageSpy).toHaveBeenCalled(); }); + + it('should support flatten option', async () => { + const updatePageSpy = jest.spyOn( + infrastructureInstances.notionDestination, + 'updatePage' + ); + + const notionPageUrl = + 'https://www.notion.so/workspace/Test-Page-12345678901234567890123456789012'; + + await mkNotes.synchronizeMarkdownToNotionFromFileSystem({ + inputPath: 'fake/input/path.md', + parentNotionPageId: notionPageUrl, + cleanSync: false, + lockPage: false, + saveId: false, + forceNew: false, + flatten: true, + }); + + // With flatten, root node is skipped, so updatePage should not be called + expect(updatePageSpy).not.toHaveBeenCalled(); + }); }); describe('constructor', () => { diff --git a/src/MkNotes.ts b/src/MkNotes.ts index 9ca410b..6f36dc5 100644 --- a/src/MkNotes.ts +++ b/src/MkNotes.ts @@ -2,8 +2,8 @@ import * as fs from 'fs'; import winston from 'winston'; import { - PreviewFormat, PreviewSynchronization, + PreviewSynchronizationOptions, SynchronizeMarkdownToNotion, SynchronizeOptions, } from '@/domains'; @@ -14,12 +14,6 @@ import { import { LogLevel } from './domains/logger/types'; -export interface 
SyncOptions { - cleanSync: boolean; - lockPage: boolean; - saveId: boolean; -} - /** * MkNotes client */ @@ -57,9 +51,8 @@ export class MkNotes { output, }: { inputPath: string; - format: PreviewFormat; output?: string; - }): Promise { + } & PreviewSynchronizationOptions): Promise { const previewSynchronizationFeature = new PreviewSynchronization({ sourceRepository: this.infrastructureInstances.fileSystemSource, }); @@ -92,6 +85,7 @@ export class MkNotes { lockPage = false, saveId = false, forceNew = false, + flatten = false, }: { inputPath: string; parentNotionPageId: string; @@ -110,6 +104,7 @@ export class MkNotes { lockPage, saveId, forceNew, + flatten, }); } } diff --git a/src/bin/cli/commands/preview.ts b/src/bin/cli/commands/preview.ts index 9c0de72..edf7cd9 100644 --- a/src/bin/cli/commands/preview.ts +++ b/src/bin/cli/commands/preview.ts @@ -51,6 +51,8 @@ command.addOption( .default('plainText') ); +command.option('--flat', 'Flatten the result page tree'); + command.option('-o, --output ', 'Output file path'); command.option('-v, --verbosity ', 'Verbosity level', 'error'); @@ -60,9 +62,16 @@ interface PreviewOptions { format: PreviewFormat; output?: string; verbosity?: string; + flat?: boolean; } command.action(async (opts: PreviewOptions) => { - const { input: directoryPath, format, output, verbosity = 'error' } = opts; + const { + input: directoryPath, + format, + output, + flat = false, + verbosity = 'error', + } = opts; if (!isValidVerbosity(verbosity)) { throw new Error(`Invalid verbosity: ${verbosity}`); @@ -77,6 +86,7 @@ command.action(async (opts: PreviewOptions) => { inputPath: directoryPath, format, output, + flatten: flat, }); // eslint-disable-next-line no-console diff --git a/src/bin/cli/commands/sync.ts b/src/bin/cli/commands/sync.ts index c450c75..61bd9c2 100644 --- a/src/bin/cli/commands/sync.ts +++ b/src/bin/cli/commands/sync.ts @@ -38,6 +38,8 @@ command.option('-s, --save-id', 'Save the page ID to the source repository'); 
command.option('-f, --force-new', 'Force a new page to be created'); +command.option('--flat', 'Flatten the result page tree'); + command.option('-v, --verbosity ', 'Verbosity level', 'error'); interface SyncOptions { input: string; @@ -48,6 +50,7 @@ interface SyncOptions { saveId?: boolean; verbosity?: string; forceNew?: boolean; + flat?: boolean; } command.action(async (opts: SyncOptions) => { @@ -60,6 +63,7 @@ command.action(async (opts: SyncOptions) => { saveId = false, forceNew = false, verbosity = 'error', + flat = false, } = opts; if (!isValidVerbosity(verbosity)) { @@ -78,6 +82,7 @@ command.action(async (opts: SyncOptions) => { lockPage: lock, saveId: saveId, forceNew: forceNew, + flatten: flat, }); // eslint-disable-next-line no-console diff --git a/src/bin/github-actions/sync.ts b/src/bin/github-actions/sync.ts index f232a5c..731e687 100644 --- a/src/bin/github-actions/sync.ts +++ b/src/bin/github-actions/sync.ts @@ -12,6 +12,7 @@ enum Inputs { Lock = 'lock', // Lock page SaveId = 'save-id', // Save ID ForceNew = 'force-new', // Force new + Flat = 'flat', // Flatten the result page tree } export const sync = async (earlyExit: boolean = false) => { @@ -23,7 +24,7 @@ export const sync = async (earlyExit: boolean = false) => { const lock = getInputAsBool(Inputs.Lock) ?? 
false; const saveId = getInputAsBool(Inputs.SaveId); const forceNew = getInputAsBool(Inputs.ForceNew); - + const flat = getInputAsBool(Inputs.Flat); const mkNotes = new MkNotes({ notionApiKey, }); @@ -35,6 +36,7 @@ export const sync = async (earlyExit: boolean = false) => { lockPage: lock, saveId: saveId, forceNew: forceNew, + flatten: flat, }); // node will stay alive if any promises are not resolved, diff --git a/src/domains/sitemap/entities/SiteMap.ts b/src/domains/sitemap/entities/SiteMap.ts index dda04e1..c0d191c 100644 --- a/src/domains/sitemap/entities/SiteMap.ts +++ b/src/domains/sitemap/entities/SiteMap.ts @@ -146,6 +146,34 @@ export class SiteMap { this.traverseAndUpdate(this._root); } + /** + * Returns a new SiteMap instance with the tree flattened under the root node + */ + public flatten(): SiteMap { + const flattenedSiteMap = new SiteMap(); + const flattenedChildren = this._root.flatten(); + + // Create a copy of the original root node WITHOUT its children + // (just the root node itself, not its nested structure) + const rootCopy = new TreeNode({ + id: this._root.id, + name: this._root.name, + filepath: this._root.filepath, + children: [], + parent: flattenedSiteMap._root, + }); + + // Create copies of all flattened children with their children cleared, so every page becomes a direct child of the new root + const flattenedChildrenCopies = flattenedChildren.map((child) => + new TreeNode({ id: child.id, name: child.name, filepath: child.filepath, children: [], parent: flattenedSiteMap._root }) + ); + + // Set children: original root copy (without nested children) first, then copies of all flattened descendants + flattenedSiteMap._root.children = [rootCopy, ...flattenedChildrenCopies]; + + return flattenedSiteMap; + } + /** * TODO: Implement mkdocs.yaml sitemap parsing * diff --git a/src/domains/sitemap/entities/TreeNode.ts b/src/domains/sitemap/entities/TreeNode.ts index 7c89ed4..45d1b88 100644 --- a/src/domains/sitemap/entities/TreeNode.ts +++ b/src/domains/sitemap/entities/TreeNode.ts @@ -57,4 +57,21 @@ export class TreeNode { return node; } + + public 
flatten(): TreeNode[] { + // Recursively flatten all children + const flattenedChildren = this.children.reduce( + (acc, child) => acc.concat(child.flatten()), + [] as TreeNode[] + ); + + // If the node is the root node, return only the flattened children + // (the root itself is not included as it's already in the SiteMap) + if (this.parent === null) { + return flattenedChildren; + } + + // For non-root nodes, include this node followed by its flattened children + return [this, ...flattenedChildren]; + } } diff --git a/src/domains/sitemap/entities/__tests__/SiteMap.test.ts b/src/domains/sitemap/entities/__tests__/SiteMap.test.ts index 7c1edfd..a2bbcff 100644 --- a/src/domains/sitemap/entities/__tests__/SiteMap.test.ts +++ b/src/domains/sitemap/entities/__tests__/SiteMap.test.ts @@ -1,4 +1,5 @@ import { SiteMap } from '../SiteMap'; +import { TreeNode } from '../TreeNode'; describe('SiteMap', () => { describe('buildFromFilePaths', () => { @@ -135,4 +136,123 @@ describe('SiteMap', () => { expect(() => SiteMap.fromJSON({})).toThrow('Invalid data'); }); }); + + describe('flatten', () => { + it('should return a new SiteMap instance', () => { + const siteMap = SiteMap.buildFromFilePaths(['file1.md', 'file2.md']); + const flattened = siteMap.flatten(); + + expect(flattened).toBeInstanceOf(SiteMap); + expect(flattened).not.toBe(siteMap); + }); + + it('should flatten a simple tree structure', () => { + const siteMap = SiteMap.buildFromFilePaths([ + 'folder1/file1.md', + 'folder1/file2.md', + ]); + + const flattened = siteMap.flatten(); + + // Root should have: root copy + flattened children + expect(flattened.root.children.length).toBeGreaterThan(0); + + // The root copy should be first + const rootCopy = flattened.root.children[0]; + expect(rootCopy.id).toBe(siteMap.root.id); + expect(rootCopy.name).toBe(siteMap.root.name); + expect(rootCopy.filepath).toBe(siteMap.root.filepath); + expect(rootCopy.children).toEqual([]); + }); + + it('should flatten nested folder structure', 
() => { + const siteMap = SiteMap.buildFromFilePaths([ + 'level1/level2/level3/file1.md', + 'level1/level2/file2.md', + 'level1/file3.md', + ]); + + const flattened = siteMap.flatten(); + + // All files should be direct children of root (after root copy) + const childrenAfterRoot = flattened.root.children.slice(1); + + // Should have root copy + all flattened descendants + expect(childrenAfterRoot.length).toBeGreaterThan(0); + + // Verify root copy exists + expect(flattened.root.children[0].id).toBe(siteMap.root.id); + }); + + it('should preserve filepaths in flattened structure', () => { + const filePaths = [ + 'docs/getting-started.md', + 'docs/advanced/features.md', + 'docs/advanced/configuration.md', + ]; + + const siteMap = SiteMap.buildFromFilePaths(filePaths); + const flattened = siteMap.flatten(); + + // Collect all filepaths from flattened structure + const collectFilepaths = (node: TreeNode): string[] => { + const paths: string[] = []; + if (node.filepath) { + paths.push(node.filepath); + } + node.children.forEach((child) => { + paths.push(...collectFilepaths(child)); + }); + return paths; + }; + + const flattenedPaths = collectFilepaths(flattened.root); + + // All original filepaths should be present + filePaths.forEach((path) => { + expect(flattenedPaths).toContain(path); + }); + }); + + it('should handle empty sitemap', () => { + const siteMap = SiteMap.buildFromFilePaths([]); + const flattened = siteMap.flatten(); + + expect(flattened.root.children).toHaveLength(1); // Only root copy + expect(flattened.root.children[0].id).toBe(siteMap.root.id); + }); + + it('should not modify the original sitemap', () => { + const filePaths = ['folder1/file1.md', 'folder1/file2.md']; + const siteMap = SiteMap.buildFromFilePaths(filePaths); + const originalRootChildrenCount = siteMap.root.children.length; + + const flattened = siteMap.flatten(); + + // Original should remain unchanged + expect(siteMap.root.children.length).toBe(originalRootChildrenCount); + + // 
Flattened should have different structure + expect(flattened.root.children.length).not.toBe(originalRootChildrenCount); + }); + + it('should set correct parent references in flattened structure', () => { + const siteMap = SiteMap.buildFromFilePaths([ + 'folder1/file1.md', + 'folder1/subfolder/file2.md', + ]); + + const flattened = siteMap.flatten(); + + // All children should have root as parent + const verifyParent = (node: TreeNode) => { + node.children.forEach((child) => { + expect(child.parent).toBe(node); + verifyParent(child); + }); + }; + + verifyParent(flattened.root); + }); + }); }); diff --git a/src/domains/sitemap/entities/__tests__/TreeNode.test.ts b/src/domains/sitemap/entities/__tests__/TreeNode.test.ts index cefe57a..1d4a9e6 100644 --- a/src/domains/sitemap/entities/__tests__/TreeNode.test.ts +++ b/src/domains/sitemap/entities/__tests__/TreeNode.test.ts @@ -144,4 +144,158 @@ describe('TreeNode', () => { expect(root.parent).toBeNull(); }); }); + + describe('flatten', () => { + it('should return empty array for root node with no children', () => { + const root = new TreeNode({ + id: 'root', + name: 'root', + filepath: '', + parent: null, + }); + + const flattened = root.flatten(); + + expect(flattened).toEqual([]); + }); + + it('should return only children for root node (not the root itself)', () => { + const child1 = new TreeNode({ + id: 'child1', + name: 'Child 1', + filepath: '/child1.md', + }); + + const child2 = new TreeNode({ + id: 'child2', + name: 'Child 2', + filepath: '/child2.md', + }); + + const root = new TreeNode({ + id: 'root', + name: 'root', + filepath: '', + parent: null, + children: [child1, child2], + }); + + const flattened = root.flatten(); + + expect(flattened).toHaveLength(2); + expect(flattened[0]).toBe(child1); + expect(flattened[1]).toBe(child2); + expect(flattened).not.toContain(root); + }); + + it('should return node itself plus flattened children for non-root node', () => { + const grandchild = new TreeNode({ + id: 
'grandchild', + name: 'Grandchild', + filepath: '/parent/child/grandchild.md', + }); + + const child = new TreeNode({ + id: 'child', + name: 'Child', + filepath: '/parent/child.md', + children: [grandchild], + }); + + const parent = new TreeNode({ + id: 'parent', + name: 'Parent', + filepath: '/parent.md', + children: [child], + }); + + // Set up a root node + const root = new TreeNode({ + id: 'root', + name: 'root', + filepath: '', + parent: null, + children: [parent], + }); + + const flattened = child.flatten(); + + expect(flattened).toHaveLength(2); + expect(flattened[0]).toBe(child); + expect(flattened[1]).toBe(grandchild); + }); + + it('should recursively flatten nested children', () => { + const level3 = new TreeNode({ + id: 'level3', + name: 'Level 3', + filepath: '/level1/level2/level3.md', + }); + + const level2 = new TreeNode({ + id: 'level2', + name: 'Level 2', + filepath: '/level1/level2.md', + children: [level3], + }); + + const level1 = new TreeNode({ + id: 'level1', + name: 'Level 1', + filepath: '/level1.md', + children: [level2], + }); + + const root = new TreeNode({ + id: 'root', + name: 'root', + filepath: '', + parent: null, + children: [level1], + }); + + const flattened = level1.flatten(); + + expect(flattened).toHaveLength(3); + expect(flattened[0]).toBe(level1); + expect(flattened[1]).toBe(level2); + expect(flattened[2]).toBe(level3); + }); + + it('should handle multiple children at the same level', () => { + const child1 = new TreeNode({ + id: 'child1', + name: 'Child 1', + filepath: '/parent/child1.md', + }); + + const child2 = new TreeNode({ + id: 'child2', + name: 'Child 2', + filepath: '/parent/child2.md', + }); + + const parent = new TreeNode({ + id: 'parent', + name: 'Parent', + filepath: '/parent.md', + children: [child1, child2], + }); + + const root = new TreeNode({ + id: 'root', + name: 'root', + filepath: '', + parent: null, + children: [parent], + }); + + const flattened = parent.flatten(); + + expect(flattened).toHaveLength(3); 
+ expect(flattened[0]).toBe(parent); + expect(flattened[1]).toBe(child1); + expect(flattened[2]).toBe(child2); + }); + }); }); \ No newline at end of file diff --git a/src/domains/synchronization/features/__tests__/preview-synchronization.feature.test.ts b/src/domains/synchronization/features/__tests__/preview-synchronization.feature.test.ts index 5a798c2..7fa5583 100644 --- a/src/domains/synchronization/features/__tests__/preview-synchronization.feature.test.ts +++ b/src/domains/synchronization/features/__tests__/preview-synchronization.feature.test.ts @@ -91,5 +91,89 @@ describe('PreviewSynchronization', () => { expect(buildFromFilePathsSpy).toHaveBeenCalledWith(filePaths); }); + + it('should flatten the SiteMap when flatten option is true', async () => { + const filePaths = ['folder1/file1.md', 'folder1/subfolder/file2.md']; + jest + .spyOn(sourceRepository, 'getFilePathList') + .mockResolvedValue(filePaths); + + const flattenSpy = jest.spyOn(SiteMap.prototype, 'flatten'); + const buildFromFilePathsSpy = jest.spyOn(SiteMap, 'buildFromFilePaths'); + + await previewSync.execute({ path: 'test/path' }, { flatten: true }); + + expect(buildFromFilePathsSpy).toHaveBeenCalledWith(filePaths); + expect(flattenSpy).toHaveBeenCalled(); + }); + + it('should not flatten the SiteMap when flatten option is false', async () => { + const filePaths = ['folder1/file1.md', 'folder1/subfolder/file2.md']; + jest + .spyOn(sourceRepository, 'getFilePathList') + .mockResolvedValue(filePaths); + + const resultNotFlattened = await previewSync.execute( + { path: 'test/path' }, + { format: 'json', flatten: false } + ); + + const resultFlattened = await previewSync.execute( + { path: 'test/path' }, + { format: 'json', flatten: true } + ); + + const parsedNotFlattened = JSON.parse(resultNotFlattened); + const parsedFlattened = JSON.parse(resultFlattened); + + // When flattened, there should be a root copy as the first child + // When not flattened, there shouldn't be a root copy + // The 
flattened version should have more direct children (root copy + all flattened files) + expect(parsedFlattened.children.length).toBeGreaterThan( + parsedNotFlattened.children.length + ); + }); + + it('should not flatten the SiteMap when flatten option is not provided', async () => { + const filePaths = ['folder1/file1.md', 'folder1/subfolder/file2.md']; + jest + .spyOn(sourceRepository, 'getFilePathList') + .mockResolvedValue(filePaths); + + const resultNotFlattened = await previewSync.execute( + { path: 'test/path' }, + { format: 'json' } + ); + + const resultFlattened = await previewSync.execute( + { path: 'test/path' }, + { format: 'json', flatten: true } + ); + + const parsedNotFlattened = JSON.parse(resultNotFlattened); + const parsedFlattened = JSON.parse(resultFlattened); + + // When flattened, there should be more direct children + expect(parsedFlattened.children.length).toBeGreaterThan( + parsedNotFlattened.children.length + ); + }); + + it('should serialize flattened SiteMap correctly in JSON format', async () => { + const filePaths = ['folder1/file1.md', 'folder1/subfolder/file2.md']; + jest + .spyOn(sourceRepository, 'getFilePathList') + .mockResolvedValue(filePaths); + + const result = await previewSync.execute( + { path: 'test/path' }, + { format: 'json', flatten: true } + ); + + const parsedResult = JSON.parse(result); + expect(parsedResult).toHaveProperty('children'); + // Flattened structure should have root copy + flattened children + expect(parsedResult.children.length).toBeGreaterThan(0); + }); }); }); diff --git a/src/domains/synchronization/features/__tests__/synchronize-markdown-to-notion.feature.test.ts b/src/domains/synchronization/features/__tests__/synchronize-markdown-to-notion.feature.test.ts index 45b3e6b..4900119 100644 --- a/src/domains/synchronization/features/__tests__/synchronize-markdown-to-notion.feature.test.ts +++ b/src/domains/synchronization/features/__tests__/synchronize-markdown-to-notion.feature.test.ts @@ -462,5 +462,210 
@@ describe('SynchronizeMarkdownToNotion', () => { lockPage: true, })).rejects.toThrow('Failed to lock page'); }); + + it('should skip root node synchronization when flatten is true', async () => { + jest + .spyOn(sourceRepository, 'getFilePathList') + .mockResolvedValue(['index.md', 'folder1/file1.md']); + + const rootContent = new PageElement({ + title: 'Root', + content: [new TextElement({ text: 'Root content' })], + }); + + const childContent = new PageElement({ + title: 'Child', + content: [new TextElement({ text: 'Child content' })], + }); + + jest + .spyOn(sourceRepository, 'getFile') + .mockImplementation(async (args: any) => { + const path = args.path; + if (path === 'index.md') { + return new FileFixture({ content: 'Root content' }); + } else if (path === 'folder1/file1.md') { + return new FileFixture({ content: 'Child content' }); + } + return new FileFixture({ content: 'Default content' }); + }); + + jest + .spyOn(elementConverter, 'convertToElement') + .mockImplementation((file: any) => { + const content = file.content; + if (content.includes('Root content')) { + return rootContent; + } else if (content.includes('Child content')) { + return childContent; + } + return new PageElement({ title: 'Default', content: [] }); + }); + + const updatePageSpy = jest.spyOn(destinationRepository, 'updatePage'); + const createPageSpy = jest.spyOn(destinationRepository, 'createPage'); + + await synchronizer.execute({ + ...defaultArgs, + flatten: true, + }); + + // Root node should not be synchronized (no updatePage call) + expect(updatePageSpy).not.toHaveBeenCalled(); + + // When flattening, the root copy is created as the first child, then the actual child + // So we get: root copy (from index.md) + folder1/file1.md = 2 pages + expect(createPageSpy).toHaveBeenCalledTimes(2); + // First call should be for the root copy + expect(createPageSpy).toHaveBeenNthCalledWith(1, { + pageElement: expect.objectContaining({ + title: 'Root', + }), + parentObjectId: 
'12345678901234567890123456789012', + parentObjectType: 'page', + }); + // Second call should be for the child + expect(createPageSpy).toHaveBeenNthCalledWith(2, { + pageElement: expect.objectContaining({ + title: 'Child', + }), + parentObjectId: '12345678901234567890123456789012', + parentObjectType: 'page', + }); + }); + + it('should synchronize root node when flatten is false', async () => { + jest + .spyOn(sourceRepository, 'getFilePathList') + .mockResolvedValue(['index.md', 'folder1/file1.md']); + + const rootContent = new PageElement({ + title: 'Root', + content: [new TextElement({ text: 'Root content' })], + }); + + const childContent = new PageElement({ + title: 'Child', + content: [new TextElement({ text: 'Child content' })], + }); + + jest + .spyOn(sourceRepository, 'getFile') + .mockImplementation(async (args: any) => { + const path = args.path; + if (path === 'index.md') { + return new FileFixture({ content: 'Root content' }); + } else if (path === 'folder1/file1.md') { + return new FileFixture({ content: 'Child content' }); + } + return new FileFixture({ content: 'Default content' }); + }); + + jest + .spyOn(elementConverter, 'convertToElement') + .mockImplementation((file: any) => { + const content = file.content; + if (content.includes('Root content')) { + return rootContent; + } else if (content.includes('Child content')) { + return childContent; + } + return new PageElement({ title: 'Default', content: [] }); + }); + + const updatePageSpy = jest.spyOn(destinationRepository, 'updatePage'); + const createPageSpy = jest.spyOn(destinationRepository, 'createPage'); + + await synchronizer.execute({ + ...defaultArgs, + flatten: false, + }); + + // Root node should be synchronized (updatePage call) + expect(updatePageSpy).toHaveBeenCalledTimes(1); + expect(updatePageSpy).toHaveBeenCalledWith({ + pageId: '12345678901234567890123456789012', + pageElement: rootContent, + }); + + // Child should be created under root + 
expect(createPageSpy).toHaveBeenCalledTimes(1); + }); + + it('should flatten SiteMap before synchronization when flatten is true', async () => { + jest + .spyOn(sourceRepository, 'getFilePathList') + .mockResolvedValue([ + 'folder1/file1.md', + 'folder1/subfolder/file2.md', + ]); + + const file1Content = new PageElement({ + title: 'File 1', + content: [new TextElement({ text: 'File 1 content' })], + }); + + const file2Content = new PageElement({ + title: 'File 2', + content: [new TextElement({ text: 'File 2 content' })], + }); + + jest + .spyOn(sourceRepository, 'getFile') + .mockImplementation(async (args: any) => { + const path = args.path; + if (path === 'folder1/file1.md') { + return new FileFixture({ content: 'File 1 content' }); + } else if (path === 'folder1/subfolder/file2.md') { + return new FileFixture({ content: 'File 2 content' }); + } + return new FileFixture({ content: 'Default content' }); + }); + + jest + .spyOn(elementConverter, 'convertToElement') + .mockImplementation((file: any) => { + const content = file.content; + if (content.includes('File 1 content')) { + return file1Content; + } else if (content.includes('File 2 content')) { + return file2Content; + } + return new PageElement({ title: 'Default', content: [] }); + }); + + const createPageSpy = jest.spyOn(destinationRepository, 'createPage'); + + await synchronizer.execute({ + ...defaultArgs, + flatten: true, + }); + + // When flattening, the root copy is created as the first child, then all flattened files + // So we get: root copy (empty) + folder1/file1.md + folder1/subfolder/file2.md = 3 pages + expect(createPageSpy).toHaveBeenCalledTimes(3); + // First call should be for the root copy + expect(createPageSpy).toHaveBeenNthCalledWith(1, { + pageElement: expect.any(PageElement), + parentObjectId: '12345678901234567890123456789012', + parentObjectType: 'page', + }); + // Second call should be for File 1 + expect(createPageSpy).toHaveBeenNthCalledWith(2, { + pageElement: 
expect.objectContaining({ + title: 'File 1', + }), + parentObjectId: '12345678901234567890123456789012', + parentObjectType: 'page', + }); + // Third call should be for File 2 + expect(createPageSpy).toHaveBeenNthCalledWith(3, { + pageElement: expect.objectContaining({ + title: 'File 2', + }), + parentObjectId: '12345678901234567890123456789012', + parentObjectType: 'page', + }); + }); }); }); diff --git a/src/domains/synchronization/features/preview-synchronization.feature.ts b/src/domains/synchronization/features/preview-synchronization.feature.ts index 664f527..4eeba1d 100644 --- a/src/domains/synchronization/features/preview-synchronization.feature.ts +++ b/src/domains/synchronization/features/preview-synchronization.feature.ts @@ -18,6 +18,14 @@ export const isValidFormat = (format: unknown): format is PreviewFormat => { ); }; +export interface PreviewSynchronizationOptions { + /** The format of the preview */ + format?: PreviewFormat; + + /** When true, flatten the result page tree */ + flatten?: boolean; +} + export class PreviewSynchronization { private sourceRepository: SourceRepository; @@ -27,7 +35,7 @@ export class PreviewSynchronization { async execute( args: T, - { format }: { format?: PreviewFormat; output?: string } = {} + { format, flatten }: PreviewSynchronizationOptions = {} ): Promise { // Check if the source repository is accessible try { @@ -56,7 +64,11 @@ export class PreviewSynchronization { const filePaths = await this.sourceRepository.getFilePathList(args); - const siteMap = SiteMap.buildFromFilePaths(filePaths); + let siteMap = SiteMap.buildFromFilePaths(filePaths); + + if (flatten) { + siteMap = siteMap.flatten(); + } return sitemapSerializer(siteMap); } diff --git a/src/domains/synchronization/features/synchronize-markdown-to-notion.feature.ts b/src/domains/synchronization/features/synchronize-markdown-to-notion.feature.ts index a1d1cae..35477ad 100644 --- a/src/domains/synchronization/features/synchronize-markdown-to-notion.feature.ts 
+++ b/src/domains/synchronization/features/synchronize-markdown-to-notion.feature.ts @@ -35,6 +35,9 @@ export interface SynchronizeOptions { /** When true, force a new page to be created */ forceNew: boolean; + + /** When true, flatten the site map */ + flatten: boolean; } export type SynchronizationResult = { @@ -65,6 +68,7 @@ export class SynchronizeMarkdownToNotion { lockPage, saveId, forceNew, + flatten, ...others } = args; @@ -106,7 +110,11 @@ export class SynchronizeMarkdownToNotion { others as T ); - const siteMap = SiteMap.buildFromFilePaths(filePaths); + let siteMap = SiteMap.buildFromFilePaths(filePaths); + + if (flatten) { + siteMap = siteMap.flatten(); + } // Traverse the SiteMap and synchronize files const pages = await this.synchronizeTreeNode({ @@ -116,6 +124,7 @@ export class SynchronizeMarkdownToNotion { lockPage, cleanSync, forceNew, + flatten, }); this.logger.info('Synchronization process completed successfully'); @@ -201,6 +210,7 @@ export class SynchronizeMarkdownToNotion { lockPage, cleanSync, forceNew, + flatten, }: { node: TreeNode; parentObjectId: string; @@ -208,6 +218,7 @@ export class SynchronizeMarkdownToNotion { lockPage: boolean; cleanSync: boolean; forceNew: boolean; + flatten: boolean; }): Promise<SynchronizationResult[]> { this.validateParentObjectType(parentObjectType); @@ -215,23 +226,29 @@ const results: SynchronizationResult[] = []; - const { page: rootPageElement, treeNodeId: rootTreeNodeId } = - await this.synchronizeRootNode({ - node: nodeToSync, - parentObjectId, - parentObjectType, - lockPage, - cleanSync, - forceNew, - }); + let rootPageElement: PageElement | undefined; + + // If not flattening, synchronize the root node + if (!flatten) { + const { page, treeNodeId } = + await this.synchronizeRootNode({ + node: nodeToSync, + parentObjectId, + parentObjectType, + lockPage, + cleanSync, + forceNew, + flatten, + }); - results.push({ page: rootPageElement, treeNodeId: rootTreeNodeId }); + + rootPageElement = page; + results.push({ page, treeNodeId }); + } for (const childNode of node.children) { try { const childResults = await this.synchronizeChildNode({ childNode, - parentPageId: rootPageElement.id!, + parentPageId: rootPageElement?.id ?? parentObjectId, lockPage, forceNew, }); @@ -258,6 +275,7 @@ export class SynchronizeMarkdownToNotion { lockPage, cleanSync, forceNew, + flatten, }: { node: TreeNode; parentObjectId: string; @@ -265,6 +283,7 @@ export class SynchronizeMarkdownToNotion { lockPage: boolean; cleanSync: boolean; forceNew: boolean; + flatten: boolean; }): Promise<SynchronizationResult> { this.logger.info( `Adding content from ${node.filepath} to parent ${parentObjectType}` @@ -292,51 +311,104 @@ export class SynchronizeMarkdownToNotion { } } - if (parentObjectType === 'unknown') { - throw new Error('Parent object type is unknown'); - } - - if (parentObjectType === 'page') { - // If clean sync is enabled, delete all existing content first - if (cleanSync) { - this.logger.info('Clean sync enabled - removing existing content'); - try { - await this.destinationRepository.deleteChildBlocks({ - parentPageId: parentObjectId, - }); - this.logger.info('Successfully removed existing content'); - } catch (error) { - this.logger.warn( - 'Failed to remove existing content, continuing with sync', - { error } - ); - } - - const newPage = await this.destinationRepository.createPage({ + switch (parentObjectType) { + case 'page': + return await this.synchronizeRootNodeWithParentPage({ + node, + parentObjectId, + parentObjectType, + pageElement, + lockPage, + cleanSync, + flatten, + }); + case 'database': + return await this.synchronizeRootNodeWithParentDatabase({ + node, parentObjectId, parentObjectType, + pageElement, + cleanSync, + }); + case 'unknown': + default: + throw new Error(`Invalid parent object type: ${parentObjectType}`); + } + } - pageElement.id = newPage.pageId; - - return { page: pageElement, treeNodeId: node.id }; + private async 
synchronizeRootNodeWithParentPage({ + node, + parentObjectId, + parentObjectType, + pageElement, + lockPage, + cleanSync, + flatten, + }: { + node: TreeNode; + parentObjectId: string; + parentObjectType: ObjectType; + pageElement: PageElement; + lockPage: boolean; + cleanSync: boolean; + flatten: boolean; + }): Promise<SynchronizationResult> { + if (cleanSync) { + this.logger.info('Clean sync enabled - removing existing content'); + try { + await this.destinationRepository.deleteChildBlocks({ + parentPageId: parentObjectId, + }); + this.logger.info('Successfully removed existing content'); + } catch (error) { + this.logger.warn( + 'Failed to remove existing content, continuing with sync', + { error } + ); } - const updatedPage = await this.destinationRepository.updatePage({ - pageId: parentObjectId, + } + + if (flatten) { + const newPage = await this.destinationRepository.createPage({ pageElement, + parentObjectId, + parentObjectType, }); - pageElement.id = updatedPage.pageId; - - this.logger.info(`Updated parent page ${parentObjectId}`); - - await this.lockPageIfNeeded(parentObjectId, lockPage); + pageElement.id = newPage.pageId; return { page: pageElement, treeNodeId: node.id }; } + const updatedPage = await this.destinationRepository.updatePage({ + pageId: parentObjectId, + pageElement, + }); + + pageElement.id = updatedPage.pageId; + + this.logger.info(`Updated parent page ${parentObjectId}`); + + await this.lockPageIfNeeded(parentObjectId, lockPage); + + return { page: pageElement, treeNodeId: node.id }; + } + + private async synchronizeRootNodeWithParentDatabase({ + node, + parentObjectId, + parentObjectType, + pageElement, + cleanSync, + }: { + node: TreeNode; + parentObjectId: string; + parentObjectType: ObjectType; + pageElement: PageElement; + cleanSync: boolean; + }): Promise<SynchronizationResult> { + // If clean sync is enabled, delete all existing content first if (cleanSync) { await this.destinationRepository.deleteChildBlocks({ parentPageId: parentObjectId, @@ -415,15 +487,17 @@ export class 
SynchronizeMarkdownToNotion { }); // Recursively process children - for (const grandChild of childNode.children) { - const grandChildSyncResult = await this.synchronizeChildNode({ - childNode: grandChild, - parentPageId: pageElement.id, - lockPage, - forceNew, - }); - syncResult.push(...grandChildSyncResult); - } + await Promise.all( + childNode.children.map(async (grandChild) => { + const grandChildSyncResult = await this.synchronizeChildNode({ + childNode: grandChild, + parentPageId: pageElement.id!, + lockPage, + forceNew, + }); + syncResult.push(...grandChildSyncResult); + }) + ); await this.lockPageIfNeeded(pageElement.id, lockPage); diff --git a/sync/index.js b/sync/index.js index 67c44b4..ed98cf3 100644 --- a/sync/index.js +++ b/sync/index.js @@ -75020,7 +75020,7 @@ class MkNotes { /** * Synchronize a markdown file to Notion */ - async synchronizeMarkdownToNotionFromFileSystem({ inputPath, parentNotionPageId, cleanSync = false, lockPage = false, saveId = false, forceNew = false, }) { + async synchronizeMarkdownToNotionFromFileSystem({ inputPath, parentNotionPageId, cleanSync = false, lockPage = false, saveId = false, forceNew = false, flatten = false, }) { const synchronizeMarkdownToNotion = new domains_1.SynchronizeMarkdownToNotion({ logger: this.logger, destinationRepository: this.infrastructureInstances.notionDestination, @@ -75034,6 +75034,7 @@ class MkNotes { lockPage, saveId, forceNew, + flatten, }); } } @@ -75061,6 +75062,7 @@ var Inputs; Inputs["Lock"] = "lock"; Inputs["SaveId"] = "save-id"; Inputs["ForceNew"] = "force-new"; + Inputs["Flat"] = "flat"; })(Inputs || (Inputs = {})); const sync = async (earlyExit = false) => { try { @@ -75071,6 +75073,7 @@ const sync = async (earlyExit = false) => { const lock = (0, utils_1.getInputAsBool)(Inputs.Lock) ?? 
false; const saveId = (0, utils_1.getInputAsBool)(Inputs.SaveId); const forceNew = (0, utils_1.getInputAsBool)(Inputs.ForceNew); + const flat = (0, utils_1.getInputAsBool)(Inputs.Flat); const mkNotes = new MkNotes_1.MkNotes({ notionApiKey, }); @@ -75081,6 +75084,7 @@ const sync = async (earlyExit = false) => { lockPage: lock, saveId: saveId, forceNew: forceNew, + flatten: flat, }); // node will stay alive if any promises are not resolved, // which is a possibility if HTTP requests are dangling @@ -76183,6 +76187,27 @@ class SiteMap { this.removeUselessNodesTree(this._root); this.traverseAndUpdate(this._root); } + /** + * Returns a new SiteMap instance with the tree flattened under the root node + */ + flatten() { + const flattenedSiteMap = new SiteMap(); + const flattenedChildren = this._root.flatten(); + // Create a copy of the original root node WITHOUT its children + // (just the root node itself, not its nested structure) + const rootCopy = new TreeNode_1.TreeNode({ + id: this._root.id, + name: this._root.name, + filepath: this._root.filepath, + children: [], + parent: flattenedSiteMap._root, + }); + // Create deep copies of all flattened children and set their parent to the new root + const flattenedChildrenCopies = flattenedChildren.map((child) => TreeNode_1.TreeNode.fromJSON(child.toJSON(), flattenedSiteMap._root)); + // Set children: original root copy (without nested children) first, then all flattened descendants copies + flattenedSiteMap._root.children = [rootCopy, ...flattenedChildrenCopies]; + return flattenedSiteMap; + } /** * TODO: Implement mkdocs.yaml sitemap parsing * @@ -76262,6 +76287,17 @@ class TreeNode { node.children = json.children.map((child) => this.fromJSON(child, node)); return node; } + flatten() { + // Recursively flatten all children + const flattenedChildren = this.children.reduce((acc, child) => acc.concat(child.flatten()), []); + // If the node is the root node, return only the flattened children + // (the root itself is not 
included as it's already in the SiteMap) + if (this.parent === null) { + return flattenedChildren; + } + // For non-root nodes, include this node followed by its flattened children + return [this, ...flattenedChildren]; + } } exports.TreeNode = TreeNode; @@ -76425,7 +76461,7 @@ class PreviewSynchronization { constructor(params) { this.sourceRepository = params.sourceRepository; } - async execute(args, { format } = {}) { + async execute(args, { format, flatten } = {}) { // Check if the source repository is accessible try { await this.sourceRepository.sourceIsAccessible(args); @@ -76452,7 +76488,10 @@ class PreviewSynchronization { } } const filePaths = await this.sourceRepository.getFilePathList(args); - const siteMap = sitemap_1.SiteMap.buildFromFilePaths(filePaths); + let siteMap = sitemap_1.SiteMap.buildFromFilePaths(filePaths); + if (flatten) { + siteMap = siteMap.flatten(); + } return sitemapSerializer(siteMap); } } @@ -76482,7 +76521,7 @@ class SynchronizeMarkdownToNotion { this.logger = params.logger; } async execute(args) { - const { notionParentPageUrl, cleanSync, lockPage, saveId, forceNew, ...others } = args; + const { notionParentPageUrl, cleanSync, lockPage, saveId, forceNew, flatten, ...others } = args; const notionObjectId = this.destinationRepository.getObjectIdFromObjectUrl({ objectUrl: notionParentPageUrl, }); @@ -76511,7 +76550,10 @@ class SynchronizeMarkdownToNotion { try { this.logger.info('Starting synchronization process'); const filePaths = await this.sourceRepository.getFilePathList(others); - const siteMap = sitemap_1.SiteMap.buildFromFilePaths(filePaths); + let siteMap = sitemap_1.SiteMap.buildFromFilePaths(filePaths); + if (flatten) { + siteMap = siteMap.flatten(); + } // Traverse the SiteMap and synchronize files const pages = await this.synchronizeTreeNode({ node: siteMap.root, @@ -76520,6 +76562,7 @@ class SynchronizeMarkdownToNotion { lockPage, cleanSync, forceNew, + flatten, }); this.logger.info('Synchronization process completed 
successfully'); if (saveId) { @@ -76580,24 +76623,29 @@ class SynchronizeMarkdownToNotion { /** * Main orchestrator for synchronizing a tree node and its children */ - async synchronizeTreeNode({ node, parentObjectId, parentObjectType, lockPage, cleanSync, forceNew, }) { + async synchronizeTreeNode({ node, parentObjectId, parentObjectType, lockPage, cleanSync, forceNew, flatten, }) { this.validateParentObjectType(parentObjectType); const nodeToSync = this.getNodeToSynchronize(node, parentObjectType); const results = []; - const { page: rootPageElement, treeNodeId: rootTreeNodeId } = await this.synchronizeRootNode({ - node: nodeToSync, - parentObjectId, - parentObjectType, - lockPage, - cleanSync, - forceNew, - }); - results.push({ page: rootPageElement, treeNodeId: rootTreeNodeId }); + let rootPageElement; + // If not flattening, synchronize the root node + if (!flatten) { + const { page, treeNodeId } = await this.synchronizeRootNode({ + node: nodeToSync, + parentObjectId, + parentObjectType, + lockPage, + cleanSync, + forceNew, + flatten, + }); + rootPageElement = page; + results.push({ page, treeNodeId }); + } for (const childNode of node.children) { try { const childResults = await this.synchronizeChildNode({ childNode, - parentPageId: rootPageElement.id, + parentPageId: rootPageElement?.id ?? 
parentObjectId, lockPage, forceNew, }); @@ -76616,7 +76664,7 @@ class SynchronizeMarkdownToNotion { * Synchronizes the root node to the parent object (page or database) * Returns the page ID to use as parent for child nodes */ - async synchronizeRootNode({ node, parentObjectId, parentObjectType, lockPage, cleanSync, forceNew, }) { + async synchronizeRootNode({ node, parentObjectId, parentObjectType, lockPage, cleanSync, forceNew, flatten, }) { this.logger.info(`Adding content from ${node.filepath} to parent ${parentObjectType}`); const pageElement = await this.fetchAndConvertToPageElement(node.filepath, { forceNew, @@ -76636,39 +76684,63 @@ class SynchronizeMarkdownToNotion { }; } } - if (parentObjectType === 'unknown') { - throw new Error('Parent object type is unknown'); - } - if (parentObjectType === 'page') { - // If clean sync is enabled, delete all existing content first - if (cleanSync) { - this.logger.info('Clean sync enabled - removing existing content'); - try { - await this.destinationRepository.deleteChildBlocks({ - parentPageId: parentObjectId, - }); - this.logger.info('Successfully removed existing content'); - } - catch (error) { - this.logger.warn('Failed to remove existing content, continuing with sync', { error }); - } - const newPage = await this.destinationRepository.createPage({ + switch (parentObjectType) { + case 'page': + return await this.synchronizeRootNodeWithParentPage({ + node, + parentObjectId, + parentObjectType, pageElement, + lockPage, + cleanSync, + flatten, + }); + case 'database': + return await this.synchronizeRootNodeWithParentDatabase({ + node, parentObjectId, parentObjectType, + pageElement, + cleanSync, }); - pageElement.id = newPage.pageId; - return { page: pageElement, treeNodeId: node.id }; + case 'unknown': + default: + throw new Error(`Invalid parent object type: ${parentObjectType}`); + } + } + async synchronizeRootNodeWithParentPage({ node, parentObjectId, parentObjectType, pageElement, lockPage, cleanSync, flatten, 
}) { + if (cleanSync) { + this.logger.info('Clean sync enabled - removing existing content'); + try { + await this.destinationRepository.deleteChildBlocks({ + parentPageId: parentObjectId, + }); + this.logger.info('Successfully removed existing content'); + } + catch (error) { + this.logger.warn('Failed to remove existing content, continuing with sync', { error }); } - const updatedPage = await this.destinationRepository.updatePage({ - pageId: parentObjectId, + } + if (flatten) { + const newPage = await this.destinationRepository.createPage({ pageElement, + parentObjectId, + parentObjectType, }); - pageElement.id = updatedPage.pageId; - this.logger.info(`Updated parent page ${parentObjectId}`); - await this.lockPageIfNeeded(parentObjectId, lockPage); + pageElement.id = newPage.pageId; return { page: pageElement, treeNodeId: node.id }; } + const updatedPage = await this.destinationRepository.updatePage({ + pageId: parentObjectId, + pageElement, + }); + pageElement.id = updatedPage.pageId; + this.logger.info(`Updated parent page ${parentObjectId}`); + await this.lockPageIfNeeded(parentObjectId, lockPage); + return { page: pageElement, treeNodeId: node.id }; + } + async synchronizeRootNodeWithParentDatabase({ node, parentObjectId, parentObjectType, pageElement, cleanSync, }) { + // If clean sync is enabled, delete all existing content first if (cleanSync) { await this.destinationRepository.deleteChildBlocks({ parentPageId: parentObjectId, @@ -76725,7 +76797,7 @@ class SynchronizeMarkdownToNotion { treeNodeId: childNode.id, }); // Recursively process children - for (const grandChild of childNode.children) { + await Promise.all(childNode.children.map(async (grandChild) => { const grandChildSyncResult = await this.synchronizeChildNode({ childNode: grandChild, parentPageId: pageElement.id, @@ -76733,7 +76805,7 @@ class SynchronizeMarkdownToNotion { forceNew, }); syncResult.push(...grandChildSyncResult); - } + })); await this.lockPageIfNeeded(pageElement.id, lockPage); 
return syncResult; }